Episode Transcript
[00:00:00] Welcome to AI Today. I'm your host, Dr. Alan Badot. And this week we're going to have a little bit of a hodgepodge of topics, all focused around your identity, protecting your identity, looking at ways that you can, you know, make sure that even when you go online, you are protecting yourself. Because with AI and the advancements that we are just continuing to see, there are so many ways that folks can get access to you, they can get your information, all with a few strokes on a keyboard. And if that's not concerning to you, then that's great, good for you.
[00:01:14] But for many of us, it is very concerning. And so, you know, as we, as we start to peel the onion back, then, you know, there are going to be some things in there that we're going to, that we're going to look at, we're going to talk about and we're going to try to explore.
[00:01:32] And so whether you're in the U.S., whether you own a small business, whether you travel to a foreign country, whatever that is, we're going to cover a lot of topics today, all around you, your identity, and fraud.
[00:01:50] And so, you know, really, quite honestly, one of the best places to start is really around social media.
[00:01:57] You know, when you think about it, the amount of data that goes in and out of these social media platforms every single day is huge.
[00:02:10] It is a, you know, a, a wonderful thing if you're Facebook or Instagram or, or X or whoever that is, and you're trying to train your models, because all of that data, everything you say is going in and out and in and out.
[00:02:29] Everything you do, same thing.
[00:02:32] And a lot of times it can be pretty much put out there for the entire public to see.
[00:02:39] You know, that's why I tell people, it wouldn't take very long for me, or even my AI, to find folks if I wanted to. And it wouldn't take very long for it to pinpoint almost your exact location, because that's how much data is out there.
[00:02:56] You know, it wasn't, you know, too long ago that when we had an Apple Watch, we were concerned about wearing it on a government site or to a government installation or building because, oh, it's going to give your identity away. And now there's so many things that can just do it that, you know, we are really making it much easier for folks to find us, including the bad guys.
[00:03:23] And they're the ones that are really taking advantage of AI at scales that we haven't seen before.
[00:03:34] You know, last week I was in, you know, Las Vegas, had a great trip.
[00:03:39] I actually was able, for once, to withdraw from the building fund that was out there. But you know, when you sit down and you're playing in the casino and you look up and you see those orbital cameras that they have, the 360° cameras, and you try to count them, you lose count pretty quick because they're all over the place.
[00:04:06] And then you realize as soon as you walk on the property, they see you all the way to reception, all the way to check in, all the way to your room. They can track the whole thing.
[00:04:21] It's pretty impressive, actually. And one of the cleaning ladies was telling me a story. She was saying that the previous guest who was staying in my room got a phone call because there was a sensor in the room that detected smoke. Somebody was smoking in there. So it's not just cameras; it's other sensors that they can also use to get information about you.
[00:04:55] And we see it everywhere. You know, and I think we're starting to become a little bit immune to that.
[00:05:05] There's good points about that, there's bad points about that. You go to the airport, they may have the facial recognition set up. Oh, it's great. I don't need my phone, I don't need my boarding pass. They take my picture and you walk through.
[00:05:20] Okay, that personally doesn't bother me because, you know, I supported the government and worked in the government for almost 25 years.
[00:05:28] So I'm used to that kind of security. But a lot of folks are not.
[00:05:34] And you know, all this information though, you know, we talk about our digital identity and trying to protect our digital footprint online.
[00:05:46] All these different sources of data are allowing folks to build profiles of us. Not just of an individual person, depending on who you are, but of all of us at the same time. So the mass emails you're potentially getting, they look real.
[00:06:06] They look pretty good. They look a lot better than they used to.
[00:06:10] They're using AI to generate those things.
[00:06:16] And that's really, that's really the scary part because it continues to improve, it continues to look more and more realistic.
[00:06:26] And you know, the cyber criminals have such an easy time of pulling that kind of data that it's actually pretty amazing what some of them are able to do. You know, I've got bots that are out searching the Internet every night just to see if there's any new information that has dropped about me. I've had my identity stolen, I don't know, five, six times. A couple of foreign countries have it. It is what it is, right? But now I'm also searching the dark web, and I have my AI out searching for hits on myself, family, friends, just so I can kind of get ahead of some of these things.
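A bot like that can start very simply: fetch pages and scan them for your own identifiers. A minimal sketch in Python, where the regex patterns, the watched name, and the sample text are all illustrative, not what the host actually runs:

```python
import re

# Simplified, illustrative patterns for identifiers a monitoring bot
# might watch for -- not production-grade validators.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "phone": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def scan_for_identifiers(text, watched_names=()):
    """Return a list of (kind, match) hits found in a page of text."""
    hits = [(kind, m) for kind, pat in PATTERNS.items() for m in pat.findall(text)]
    for name in watched_names:
        if name.lower() in text.lower():
            hits.append(("name", name))
    return hits

# Example: a page that leaked a phone number, an email, and a watched name.
page = "Contact Alan at 555-123-4567 or alan@example.com for details."
for kind, value in scan_for_identifiers(page, watched_names=["Alan"]):
    print(kind, value)
```

A real version would sit behind a crawler or breach-notification feed and alert you on new hits; the scanning step itself stays this simple.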
[00:07:11] Whether it's, you know, a mortgage company that left your information out by the dumpster and somebody grabbed it, or an email list or whatever, our identities are kind of being hijacked.
[00:07:32] And so, you know, we've gotta, we've got to do everything that we can to try to protect those.
[00:07:38] Now, some of the things that we're going to talk about in the next couple of segments, we're going to talk about, you know, AI scraping, you know, what does that mean?
[00:07:48] Does, you know, and, you know, is it, is it legal? Is it allowed? Is there gray area, you know, those kind of things.
[00:07:55] You're also going to hear me say things like AI crawlers (similar to scraping), deepfakes, phishing, synthetic IDs, all of those kinds of things that are allowing folks to spoof us.
[00:08:12] And that can be, that can be pretty interesting.
[00:08:15] And whether it's, you know, in the US or in a foreign country, it really, it really, it doesn't matter because there are breaches taking place all over the world.
[00:08:29] It doesn't have to be just in the U.S. Even though Europe and some other countries have stronger AI policies, that doesn't mean that those kinds of fraudulent schemes don't happen.
[00:08:43] And what's even a little bit scarier in some cases: I use the casino as an example because that is probably the densest set of camera systems that you'll find in the U.S. There are probably some stores that'll rival it, and maybe some parts of cities and some office buildings, those kinds of things, but it's not everywhere you go outside.
[00:09:14] And there are some countries that do that.
[00:09:17] There's some countries that you go to, it's guaranteed they're going to get your profile, they're going to get your face, they're going to have it, and they're going to do whatever they want to do with it.
[00:09:31] And so those are some things to think about as we talk through some of these next segments. You know, as our ability to travel pretty much where we want grows, it's getting easier to spend money in different countries with the different types of currency. And as they potentially go to a digital currency, then they'll be able to track every single thing that you purchase also.
[00:10:00] But it's getting easier to do those kinds of things, and our general privacy is pretty much gone as soon as you leave the door of your home. That's pretty much the only place where you can guarantee that somebody's not going to capture your face, see you, or do something to get some information about you. And so in the next segments we are going to really deep dive into some of the social media stuff, as well as the different types of fraud that are out there and what that threat landscape looks like. And it doesn't matter if you're a small business, it doesn't matter if you stay home all the time or you travel, whatever that is, it's important for you.
[00:11:03] So, you know, I want you to stick around because this is one of those serious shows where I'm going to try to help give you some, some advice and point you in the right direction and prepare you for some of the things that you might see as you go overseas or you travel or you get online.
[00:11:19] So stay with us, come back in a few minutes, and we will talk about fraud and how to protect yourself. Stay right with us.

Welcome back to AI Today. I'm your host, Dr. Alan Badot, and we are talking about our digital identities, and we're talking about how to prevent them from being stolen, taken, really accessed without our permission. And the last segment was really an intro to talk about the different types of fraudulent activity that's out there, ways that you don't anticipate even being on camera and somebody taking your picture.
[00:12:35] So, you know, that's the challenge that we have. You know, I gave the example: I was in Las Vegas, had a great trip. And you look around and you see hundreds of cameras in the ceiling.
[00:12:47] And you know, that's just how it is. And I accept that. I love to play, so it is what it is. But think about if you're going to a foreign country, think about if you check into a hotel and there happens to be a camera in your room, and why are they doing those kinds of things. We'll leave the government side alone. But everything else is really about trying to either get data, get your picture, blackmail, or more. There are so many different opportunities for criminals today that they didn't have previously, or if they did have them, they had to work a lot harder in order to be successful. With AI, that's gone.
[00:13:47] Because what it has allowed them to do is really supercharge their, their schemes.
[00:13:54] You know, we'll use the DMV scam. That's a very popular one. I've gotten it. A lot of folks that I know have gotten it.
[00:14:03] You know, when you see that the first time, you're probably thinking, holy cow, I don't think I have anything going on at the DMV. I think I'm all right.
[00:14:11] Or you get a call from the sheriff's department, and it happens to be a number from the Philippines.
[00:14:19] Good indicator, right? But it just makes it so much easier for them to do that because they can go out, they can get all that data, they can train their AI and then they can turn it loose.
[00:14:29] You know, it's really synthetic identity fraud, and that allows criminals to combine real-world data with fake information to create some new identity.
[00:14:47] And that's, that's kind of scary, you know, whether it's a deep fake, you know, some of the things that some folks went through. I know everybody probably heard about what was happening to, you know, to Taylor Swift. Very unfortunate. Some of the deep fakes that were, were out there of her. But it's, you know, really any celebrity is, is really, you know, it's so easy for them to get that information and generate that kind of stuff.
[00:15:14] And I was reading earlier that in 2023, just 2023, that cost consumers about $35 billion for those types of activities. $35 billion in fraudulent activities. And that was in 2023.
[00:15:39] And where AI is today.
[00:15:42] That's why it's so much easier for, for folks to be able to do that. It's just come such a long way in a few years.
[00:15:50] You know, you block one text from some recruiter for a job you never applied for, who wants you to become a millionaire by only working 15 hours a week. Okay, yeah, that's really gonna work.
[00:16:05] But the truth is, you block one, there's another that's going to come behind it, and then another and another and another.
[00:16:14] And so how do you, how do you really stop that?
[00:16:18] You know, how do you tell this scammer is trying to scam you?
[00:16:24] Travel sites are great ways for scammers to take advantage of you, because it's so easy for them to make a website that looks like a hotel or a destination. And let's adjust the pictures a little bit. Maybe we'll AI the pictures a little bit. Right. And by the time you know it, you are staying at a hotel that you had no intention of staying at, in a room that you thought was a little bit different, a little bigger, or whatever that is. It's just so easy now to do those kinds of things. And I know a lot of folks have experienced some of this stuff even when they travel abroad, because you're looking at a lot of really small businesses, mom and pop owners, whether it's a Vrbo or whether it's something else.
[00:17:25] You know, a lot of these, a lot of these folks are having hard times distinguishing themselves from the scammers.
[00:17:33] And that is, that's very scary.
[00:17:38] Booking.com, it's another one that I saw.
[00:17:42] They've seen a 500% spike in travel scams. And when the summer hits and people are going overseas, that's amazing.
[00:17:54] 500%.
[00:17:58] That's scary.
[00:18:02] You know, it used to be you'd worry about somebody taking your picture, or you'd hear stories about, oh, you get in a cab and somebody else jumps in the cab with you and off you go, those kinds of things. But now it's more powerful, and you're worth more.
[00:18:22] If they can get all your digital identity and all your digital information, bank accounts, credit card statements, all that stuff, you're worth more to them.
[00:18:31] So how do you, how do you protect yourself?
[00:18:35] How do you take a step back, reevaluate some of the things that you're doing and just get smart on, you know, where that next trip is going to be, who you're going to use, what, what tools are you going to use to try to evaluate those kind of things? What questions are you going to ask that's going to maybe send up a red flag?
[00:18:59] And you know, that's, that's really, that's really key.
[00:19:03] You know, everybody always says the same, the same thing, right? Only go to verified sites.
[00:19:11] Well, with AI, it's kind of hard to tell with those URLs. Now, is it a verified site or not?
[00:19:21] Is it suspicious? Maybe.
[00:19:24] So that's where you might want to take the URL and drop it into a ChatGPT or something like that to see what sort of information it can find.
[00:19:37] Because with Google, I mean, it's gotten a lot better with Gemini and stuff, but now it's the same sort of thing. You drop that info in there and you're not 100% sure, is it real or is it not real?
[00:19:50] That makes it even harder.
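Before you hand a link to a chatbot, you can also run a few mechanical checks of your own. A rough sketch of some first-pass red-flag heuristics; the thresholds, the helper name, and the example URL are all made up for illustration, and a flag is a reason to slow down, not a verdict:

```python
from urllib.parse import urlparse

def url_red_flags(url, expected_brand=None):
    """Return a list of quick red flags for a URL. Illustrative only."""
    flags = []
    parsed = urlparse(url)
    host = parsed.hostname or ""
    if parsed.scheme != "https":
        flags.append("not using https")
    if host.startswith("xn--") or ".xn--" in host:
        flags.append("punycode (possible lookalike) domain")
    if host.count(".") >= 3:
        flags.append("many subdomains")
    if expected_brand and expected_brand.lower() in host:
        # Brand name buried in a subdomain of some other registered
        # domain, e.g. booking.com.evil-site.net
        registered = ".".join(host.split(".")[-2:])
        if expected_brand.lower() not in registered:
            flags.append("brand name outside the registered domain")
    return flags

print(url_red_flags("http://booking.com.secure-login.example.net/pay",
                    expected_brand="booking"))
```

A clean, legitimate URL such as `https://www.booking.com/` comes back with no flags under these rules, while the lookalike above trips three of them.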
[00:19:53] And, you know, phone calls, those kinds of things.
[00:19:59] You know, the motive of a criminal is really, they want to get paid and they want to get paid quickly.
[00:20:09] I mean, that's pretty simple.
[00:20:12] So if you get a phone call, somebody demanding that you send them a check, and it could be a sheriff, it could be a fake, whoever it is.
[00:20:24] You know, you've got to ask yourself, one, I've never talked to this person. Do I know them? Probably not.
[00:20:32] Two, why are they calling you if you don't know them and you know you didn't do anything wrong?
[00:20:41] Yeah. And then three, oh, you got to pay them by tomorrow.
[00:20:45] Okay, that's, that's a pretty good indicator. But there, there are now sites that allow you to track some of these trends, you know, to, to get out there and to, you know, really try to, to get the word out that something is a scam.
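Those three questions can even be turned into a little checklist score. A toy sketch; the signals and weights are purely illustrative, not any official rubric:

```python
# The three questions above, plus one bonus signal, as a checklist score.
def scam_call_score(known_caller, expected_contact, urgent_deadline,
                    odd_payment_method=False):
    score = 0
    if not known_caller:
        score += 1          # one: never talked to this person before
    if not expected_contact:
        score += 1          # two: no reason they should be calling you
    if urgent_deadline:
        score += 2          # three: "pay by tomorrow" pressure weighs heaviest
    if odd_payment_method:
        score += 2          # gift cards, wire transfer, crypto
    return score

# The "sheriff" call with a Philippines number from earlier in the segment:
score = scam_call_score(known_caller=False, expected_contact=False,
                        urgent_deadline=True, odd_payment_method=True)
print(score, "-> hang up" if score >= 3 else "-> probably fine")
```

The exact numbers don't matter; the point is that two or three of these signals together should end the call.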
[00:21:04] That's really some of the most power that we have now, being able to do those kinds of things. You know, wearable glasses like mine, I can take pictures with these.
[00:21:21] I don't, I do not do that, but I could.
[00:21:26] And then, you know, you're walking up, you're having a conversation. I can record the conversation, I can listen, I can get all that information, and it goes back and trains an AI on that kind of information.
[00:21:41] But from my perspective, like everything else, when it comes to technology, transparency is key.
[00:21:49] You know, I could do that, I could tape stuff.
[00:21:53] I wouldn't do that without asking somebody's permission because I think that's just the way that we have to do it.
[00:22:02] But it's very easy to do those kind of things.
[00:22:07] You know, all I got to do is press the button right over here.
[00:22:10] Boom. It'll take a picture.
[00:22:13] And, you know, if I was doing something like this and wanted to hide it or something, I could do that. And you probably wouldn't be able to tell.
[00:22:21] And, and that's the, that's the kind of things that we have to worry about from a technology perspective.
[00:22:27] And when you go to foreign countries that don't have the same kind of laws that we have, then it really starts to make you question maybe your travel choices, or maybe your overseas college internship, things like that. Because there are so many different ways now to capture your information that you just have to be much more diligent. You've got to be much smarter, and you've got to pay attention to some potential ways that you can throw some of these things off.
[00:23:13] It's probably why you're thinking, he's got an interesting patterned jacket on. Well, we'll talk about that when we come back.
[00:23:23] You know, after a couple of advertisements from our sponsors. But I think you're going to be interested in hearing the next segment about: can I fool a facial recognition camera just with what I have on?
[00:23:40] We're going to talk about that. We'll deep dive into it. Stay with us. We'll be back in here in a few, few minutes.
[00:23:54] Welcome back to AI Today. I'm your host, Dr. Alan Badot. And this week we are talking about protecting your identity.
[00:24:24] We're talking about a little bit of fraud in there and, and really just trying to, trying to get smart about the different ways and how easy it is for folks to be able to figure out who you are, you know, get your identity, get your information and get your picture.
[00:24:45] Yeah, yeah. I was talking about how I went to Vegas, had a great trip.
[00:24:51] I was on hundreds of cameras during, during that time.
[00:25:00] And yeah, that's a lot. That's a lot of photos of me. Now, thank goodness we're in the country that we are, where they just can't dump that stuff out on the Internet. But folks are doing it already.
[00:25:17] The number of selfies, the number of, you know, I'm at the beach, I'm here, here's my house, here's this, here's that, here's another picture of something else. It's pretty scary if you sit down and think about it.
[00:25:35] Somebody can get your information that easily, or they can take a picture of you off the Internet and do a comparison using some of these online tools that do facial recognition. They're not great, but there are some that are significantly better, I can promise you that.
[00:25:54] But, you know, then you start to think, though, you know, I'm going on vacation. I'm going to a foreign country.
[00:26:01] When's the last time you asked yourself, how many cameras am I going to be on when you go on vacation?
[00:26:09] Now, if you're a creator, that's, that's a little bit different, but if you're just going there and trying to have a good time and, you know, you don't necessarily mind or even think about your face being, you know, plastered all over a foreign country's camera system, well, you know, that's great, but it's happening all over the place.
[00:26:33] You know, we know about China and their mass surveillance programs that they have. They can track citizens as well as visitors with, you know, the same sort of precision.
[00:26:47] They can track you from, you know, the airport to your hotel.
[00:26:52] That's pretty interesting.
[00:26:56] And it does make you think, though. Oh, geez, you know, your privacy. Well, you don't really have privacy when you go to a foreign country necessarily. And it's not just, you know, China in a lot of cases.
[00:27:08] It's, you know, really any country that has that kind of investment in their surveillance programs.
[00:27:19] You've got to think about that, because you are being cataloged. Your face is being put in a catalog and tracked without your knowledge.
[00:27:29] You don't know what they're using it for. It could be training AI models, could be a lot of different things. And so you've gotta, you've gotta be mindful of, of that.
[00:27:39] Now here's the other thing, though, with AI. You know, it used to be it could do one image, right? And you'd have to really focus in on that one face that it was gonna scan, and maybe it'll take a couple of minutes.
[00:28:01] So hopefully they're sitting down. That's not the case anymore. That is not the case.
[00:28:08] You know, AI can, can really scan huge crowds now.
[00:28:15] You know, you see those green boxes that'll pop up every once in a while?
[00:28:20] That's the labeling box. They're capturing your, your image and they're going to label it.
[00:28:25] They're going to put some characteristics around it. With me, it would be white, bald guy, mid-40s maybe, something like that. They'll put something like that on there, they will tag it, and then they'll use it to train their models.
[00:28:42] That's a concern, right?
[00:28:45] That kind of stuff's not supposed to happen, but it does.
[00:28:48] You go to a foreign country, you can't even, you can't even question it. It's just going to happen.
[00:28:56] And, you know, it does bring up a question though, doesn't it?
[00:29:02] Can you fool them?
[00:29:05] Well, you know, I get that question all the time. That's why I got this jacket on. I'm just going to wear a bright pink shirt with a pattern on it, and it would definitely clash something fierce. Yeah, it would be scary with the shine on my head.
[00:29:26] But there are some, some people say that, you know, you can fool AI and you can fool these cameras with, with just, with the clothes that you wear, the glasses that you have on.
[00:29:40] Well, let me just put it this way, and we can talk a little bit about that. You know, it really started around Covid. If you remember, the big push was, oh, I need to use my face to open my iPhone.
[00:29:58] And then what did Apple do? They released a little update so that now it can recognize your face when you have a mask on.
[00:30:12] Oh, yeah. I can still open my iPhone without having to take off my mask, right?
[00:30:18] Well, okay, that technology is something fierce. And one of the terms that we use in the AI field, for folks trying to fool AI with the clothes that they have on, is adversarial fashion.
[00:30:43] I was going to embrace the fashion anarchism, you know, to try to show you all some stuff. I just couldn't do it. I'm sorry. I could not bring myself to wear something that was so off, so not right, that it would disrupt AI. But really, the thought is this.
[00:31:07] You know, AI, especially the object characterization type of AI, where it's just trying to characterize something and capture an image, those kinds of things.
[00:31:20] It's looking for patterns, it's trying to detect different patterns.
[00:31:25] And the thought is, if you wear something that is so crazy looking, you're going to disrupt the patterns, you're going to disrupt the training that the AI has had, and you're going to be able to fool it.
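You can see that idea with a toy example: a naive detector that matches a stored template gets thrown off when a loud pattern is overlaid on the signal. This sketch is deliberately simplistic; modern multimodal recognizers are much harder to fool, which is exactly the host's point:

```python
# Toy illustration of the adversarial-fashion idea on 1-D "signals".
def similarity(a, b):
    """Cosine similarity between two equal-length signals."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = (sum(x * x for x in a) ** 0.5) * (sum(y * y for y in b) ** 0.5)
    return dot / norm

template = [1, 2, 3, 4, 3, 2, 1, 0]   # the "face" the detector learned
clean    = [1, 2, 3, 4, 3, 2, 1, 0]   # same face, clean capture
# Overlay a loud alternating pattern, like a wild shirt over the features.
noisy    = [x + p for x, p in zip(clean, [9, -9, 9, -9, 9, -9, 9, -9])]

print(round(similarity(template, clean), 3))   # near-perfect match
print(round(similarity(template, noisy), 3))   # the pattern swamps the match
```

In this toy the clean capture matches the template perfectly, while the patterned one drops well below any sensible detection threshold. A recognizer that also looks at geometry, depth, and other features, as described next, doesn't degrade this way.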
[00:31:42] You know, that was the case a year ago, maybe 18 months ago, of course, with some of those older models.
[00:31:52] But here's what I'd say.
[00:31:57] If you think about the advancements that AI has made in the last three months alone, I cannot tell you with any certainty that adversarial fashion works anymore.
[00:32:19] There's so many other things that it's using now. It's looking for features, it's not looking just for patterns now, it's looking for a lot of other things. You know, really a multimodal type approach to how it's trying to characterize you.
[00:32:39] You know, if you're larger, it looks at those kinds of things. If it's certain features that you may have, ear size, nose size, distance between your eyes. There's a lot of things that they're using now, so it's not going to fool AI if you've got on a crazy jacket with a really crazy shirt. It just doesn't work that way.
[00:33:07] And that kind of information is really gold. Your face is really the driving factor of a lot of things that you can do. Think about the trouble we had with Real ID, how many folks kept putting that off because their Real ID might not work in a system, or it's just a pain in the butt, or they didn't want to pay extra for it. Whatever. You know, that's your access, that's your ticket to do a lot of things.
[00:33:41] And the easiest way to spoof somebody is to use your face, pretend you're somebody you're not, or pretend you're somebody else, right?
[00:33:51] And you know, that's a, that's, that's really a challenge.
[00:33:57] And so there are some things, though, that you can do, that I can give you a little bit of advice around. Right. And the first one is, if you can, limit the number of clear photos of you just facing the camera that you share online.
[00:34:20] Meaning, like, you don't want to have a whole bunch of profile pictures where it's you and it's, you know, like me right here, this is, this would be my profile picture.
[00:34:29] Snap the picture and I do a whole bunch of them with a lot of different things.
[00:34:34] If you can prevent that as much as possible, it's a lot harder to train on your data.
[00:34:42] That's one way that you can protect yourself now. You could buy a helmet and wear a helmet all the time. You can try a whole bunch of different glasses. It's not gonna work, because you're really not necessarily changing your facial features. All you're doing is putting on something else that might add to them, and in some cases you're just giving the AI more data that it can train on.
[00:35:06] Definitely probably not what you had in mind, but some other things. You know, when you're going to travel overseas, I really recommend to folks, figure out where you're going, look online, find out what kind of surveillance environment it is.
[00:35:28] If it's something you just don't feel like, being on camera 24/7, 365, or whatever your plans are from that perspective, then look that kind of stuff up. That's important.
[00:35:43] And then, you know, really, you know, there are things that you can put on your phone also that make it a little bit easier to, to scramble some things.
[00:35:58] You know, you can, you know, you've seen these pictures of, of people online and their face is scrambled.
[00:36:08] Maybe it's gone through one or two of those encryption-type scrambling algorithms. And those are pretty good.
[00:36:18] So when you put your face online, it's scrambled. Put something online that could give, give you away, it's scrambled.
[00:36:28] That's a good start, because even if somebody scrapes your data, they're probably not going to want to use your images, because they can't see your face. It doesn't help them, doesn't do them any good. Same sort of thing with any kind of family photos or anything like that where your family could be identified. Look at those things as well.
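The scrambling idea can be sketched as simple block pixelation on a toy grayscale image. Real tools blur or encrypt the face region, but the effect is the same: the detail a recognizer needs is destroyed. The function name and the tiny sample image are illustrative:

```python
def pixelate(image, block=2):
    """Replace each block x block tile with its average value."""
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for by in range(0, h, block):
        for bx in range(0, w, block):
            tile = [image[y][x]
                    for y in range(by, min(by + block, h))
                    for x in range(bx, min(bx + block, w))]
            avg = sum(tile) // len(tile)
            for y in range(by, min(by + block, h)):
                for x in range(bx, min(bx + block, w)):
                    out[y][x] = avg
    return out

# A tiny high-contrast "face": sharp detail a recognizer could latch onto.
face = [[10, 200, 10, 200],
        [200, 10, 200, 10],
        [10, 200, 10, 200],
        [200, 10, 200, 10]]

for row in pixelate(face):
    print(row)
```

Every 2x2 tile averages out to the same flat value here, so the sharp checkerboard detail vanishes entirely; on a real photo, a larger block size applied over the face region does the same thing to the features recognition depends on.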
[00:37:01] You know, how many times have you, I know I do it all the time, not on purpose either, but sometimes you'll just accidentally be in a photobomb, right? And it's easy for me to see, because it's usually in between two people with a lot of hair, and I'll have glasses on, right? And so they can see that. Oh, yeah. And then somebody tags you, then it's over.
[00:37:23] Right. So just try to, try to be polite and don't get in other people's photos.
[00:37:29] That's a good start.
[00:37:32] Now, there's some other things too, but really just the reality of it is if you are online, if you've got a profile out there, those are some of the things that you just want to try to minimize.
[00:37:47] And that's going to be your best defense. Because it didn't work with masks, right?
[00:37:53] Unless you want to get a helmet and wear a facial mask over that. It's really the only way that you're going to be able to protect yourself from having an image of your face taken.
[00:38:06] But if you do have a helmet, you're gonna be on somebody's radar a lot faster anyway.
[00:38:13] So don't get a helmet. It's not going to help you.
[00:38:17] So, you know, be smart and, you know, don't think that you're going to be able to fool these, these AI algorithms. And especially the longer you wait, the less likely it is that you're going to be able to fool them. So stick around. We'll be back after a couple of short messages from our sponsors and we will close out the show by talking about, you know, some other types of fraud and ways that you can protect yourself. So we'll be right back.
[00:39:15] Welcome back to AI Today. I'm your host, Dr. Alan Badot. And we have been talking about fraud. We've been talking about protecting our identity.
[00:39:24] Last segment, we talked about trying to fool AI with the clothes that we wear, and quite honestly, it doesn't work. It's not consistent. You're not going to be able to do it all the time. Early on, that was different.
[00:39:45] The algorithms today, with the amount of training that they've gone through, it doesn't matter if you've got the ugliest shirt in the world on; it would probably recognize that you have the ugliest shirt in the world while it recognizes you at the same time. Right? So think about those kinds of things. Think about the ramifications of that when you travel abroad today.
[00:40:13] Because, you know, of course, everybody wants the world to be safe.
[00:40:19] It's just that some countries go about it a little bit differently than others.
[00:40:24] And if there's a camera system out there now, I can almost guarantee that there is an AI somewhere in that process that's taking that footage and using that data.
[00:40:41] You know, we used to say, oh, the ghost in the machine, right? Look that term up if you're too young to recognize it; we always talked about the ghost in the machine.
[00:40:52] Well, now we have a ghost in the camera. We've got a ghost in the phone. We've got ghosts all over the place.
[00:41:01] And AI is using that information.
[00:41:05] It's watching, or it's using cameras to watch you.
[00:41:10] It's using a mouse and a computer to track you online.
[00:41:18] It could be using your information from a hospital, potentially, or a bank, or wherever, just so it can understand you. In some cases for good things, like, oh, maybe it's time to reorder something. In other cases for bad things, like somebody trying to spoof you and get your records, get information, get money, whatever that is.
[00:41:47] But you've got to start to take a stand, and you've got to think about these things now. As you move forward, you have to control your data.
[00:42:02] And I know this is going to pain a lot of people, but you've got to start to share maybe a little bit less than you do, especially on social media sites.
[00:42:12] You know, privacy is important. Privacy matters.
[00:42:16] Think about that post that you put up there with your picture on it.
[00:42:23] Maybe you'll not do quite as many as you were doing before, because that info is being used.
[00:42:32] You've got to make sure that you are securing your accounts, securing your family's accounts, and really monitoring all the different types of information that are out there. You know, that's why I have my bots that go out and search all these other places, starting at midnight, and I get a report that it generates at 6:00am. I see, oh, did my Social Security number show up on X, Y, and Z site this time? Did something else show up, my credit card number or whatever, on the deep web or the dark web? It's making sure that I'm doing everything possible to manage my credentials and manage my digital identity. Same thing for everybody else.
[00:43:22] You've got to do that.
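The host doesn't describe how his monitoring bots actually work, but the core of any credential-monitoring pass is pattern matching over leaked data. Here is a minimal, hypothetical sketch in Python: it scans a text dump for card-number-shaped digit runs and filters out false positives with the Luhn checksum, the standard check-digit test that real card numbers satisfy. The dump string and card number below are made up for illustration.

```python
import re

def luhn_valid(digits: str) -> bool:
    """Luhn checksum: double every second digit from the right; sum must be % 10 == 0."""
    total = 0
    for i, ch in enumerate(reversed(digits)):
        d = int(ch)
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def scan_dump_for_cards(text: str) -> list[str]:
    """Return 13-16 digit sequences in a leaked dump that pass the Luhn check."""
    hits = []
    for match in re.finditer(r"\b(?:\d[ -]?){13,16}\b", text):
        digits = re.sub(r"[ -]", "", match.group())
        if 13 <= len(digits) <= 16 and luhn_valid(digits):
            hits.append(digits)
    return hits

# A fake "dark web dump" line; only the Luhn-valid run is flagged.
dump = "user=alice pw=hunter2 card=4532 0151 1283 0366 note=old 1234567890123"
print(scan_dump_for_cards(dump))  # prints: ['4532015112830366']
```

A real monitoring service adds the plumbing around this: scheduled crawls, breach-database lookups, and an alert report, but the matching step is essentially this simple.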
[00:43:27] Images, though, that's a new level.
[00:43:30] You know, I don't think folks really saw the impact it was going to have on something like this, this quickly.
[00:43:40] If you think about it, I mean, there are ways AI can identify so many different things now, almost in real time.
[00:43:50] That's, one, impressive from a technology perspective, and I love it, and I use it, and I try to do a lot of different things with it. But then, two, as a spokesperson who wants transparency, it's also very concerning.
[00:44:08] You've got to make sure that you're using tools like Glaze, which is a really good one, and Nightshade. Two very good ones that you can cloak your identities with.
[00:44:28] And, you know, it's important to be able to do that. But the reality is, as we become more advanced in AI, those cloaking technologies aren't going to keep working. So we're going to have to come up with something else.
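Glaze and Nightshade are real research tools from the University of Chicago that add carefully computed, near-invisible perturbations to images so that AI models misread them; the actual algorithms optimize the perturbation against learned feature extractors. The toy sketch below only illustrates the underlying idea, that pixel changes too small for a person to notice can still break a naive, exact-match recognizer. It is not how Glaze or Nightshade compute their perturbations, and the 3x3 "image" is made up.

```python
import hashlib
import random

def cloak(pixels: list[list[int]], epsilon: int = 2, seed: int = 0) -> list[list[int]]:
    """Nudge each 0-255 pixel by a small nonzero amount, at most +/-epsilon."""
    rng = random.Random(seed)
    deltas = [d for d in range(-epsilon, epsilon + 1) if d != 0]
    return [
        [min(255, max(0, p + rng.choice(deltas))) for p in row]
        for row in pixels
    ]

def fingerprint(pixels) -> str:
    """Stand-in for a naive exact-match recognizer: hash the raw pixel bytes."""
    return hashlib.sha256(bytes(p for row in pixels for p in row)).hexdigest()

image = [[120, 121, 119], [118, 122, 120], [121, 119, 118]]
cloaked = cloak(image)

# Every pixel moved by at most epsilon, yet the exact-match fingerprint breaks.
max_change = max(abs(a - b) for r1, r2 in zip(image, cloaked) for a, b in zip(r1, r2))
print(max_change <= 2, fingerprint(image) != fingerprint(cloaked))  # prints: True True
```

Modern recognizers compare learned features rather than raw bytes, which is exactly why the real tools have to target those feature extractors, and why the host is right that this is an arms race.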
[00:44:41] Same thing with new tools, whether it's quantum computing, which is hopefully not too far down the road, or some other technologies that are out there.
[00:44:51] Again, it's all about staying ahead of the technology, or at least understanding the ramifications of those technologies as they use AI: how they can impact you, how they can impact your business.
[00:45:08] You know, that kind of, that kind of stuff is, is very, very important.
[00:45:14] And there's, of course, nothing wrong with protecting yourself and the surveillance necessary for that. Heck, I've got 12 cameras around my house, covering pretty much every angle that you can think of.
[00:45:30] But I also don't sell that information. I don't put it online either.
[00:45:35] So, you know, that's the important thing. You've got to do a better job, just in general, of protecting yourself and being knowledgeable about the situations you're in, the places you're traveling, and the information that you're putting out there, because you've got to stay informed. You have to.
[00:45:57] AI is changing so fast. So many communities are trying to do some different things.
[00:46:05] But they're doing things that probably don't have a lot of teeth in them.
[00:46:11] And that's tough.
[00:46:13] That's tough because, you know, as the laws change, the technology changes, the risk changes, your risk profile changes. It used to take a while for those types of things to happen. Now it can happen in a couple of weeks.
[00:46:27] Ooh, somebody released a new model. Somebody else released a new model. Now it does facial recognition, those kinds of things. It's happening at speeds that we are not used to.
[00:46:38] And so you've got to be more prepared.
[00:46:40] You've got to think about your security. You've got to think about the places you're putting information up on. You've got to think about software and other technology that can protect you. And you've got to think about privacy, making sure that your privacy settings are all set and up to date.
[00:47:00] You know, I've told a lot of people that if you're just wide open and anybody can get to your site, well, they can get to all your photos, and they can use those against you if they really wanted to, unless you have countermeasures in place.
[00:47:17] So just think about that as you snap that next selfie and put it up on your favorite social media platform.
[00:47:29] You know, are you giving out more information than you really want out there?
[00:47:36] We know AI is a powerful tool.
[00:47:40] That's the whole point of this show: it can be used for good things, but also for bad things.
[00:47:47] And the more you understand how your data can be obtained, how your information can be collected and gathered and put into a giant data lake and then used however somebody else wants to use it, maybe without your permission or maybe with it.
[00:48:07] That understanding is going to drive a lot of the things that we do online, whether it's booking travel online, shopping online, or sharing information online. All of those things become very important. That's why a lot of these social media firms have stepped up; they've got bot protections on there.
[00:48:29] They've got terms on there that say my site can't be scraped, and if you scrape it, then I can hold you accountable, those kinds of things. And I think we are taking some smart steps. We're going in the right direction. But you cannot rely on somebody else to create a law that is going to protect you.
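One concrete form those "don't scrape my site" rules take is a robots.txt file that a site publishes and that well-behaved bots are expected to honor; it's a convention, not an enforcement mechanism, which is part of the host's point about not relying on rules alone. A short sketch using Python's standard library, with a made-up ruleset and made-up URLs:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt like the ones social platforms publish.
ROBOTS_TXT = """\
User-agent: *
Disallow: /photos/
Disallow: /profile/
Allow: /about
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

def may_fetch(url: str, agent: str = "my-scraper") -> bool:
    """A well-behaved bot checks robots.txt before touching a URL."""
    return parser.can_fetch(agent, url)

print(may_fetch("https://example.com/about"))      # allowed path
print(may_fetch("https://example.com/photos/me"))  # disallowed path
```

Note that nothing in the protocol stops a bad actor from fetching the disallowed paths anyway, which is why the host's advice is to minimize what you publish in the first place.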
[00:48:52] You've got to rely on yourself and do the steps that you need to do to protect yourself.
[00:48:59] And then, at the end of the day, that's how you're going to be able to say, you know what? I wasn't part of this cyber attack, or I wasn't part of this fraudulent scheme, because you took the initiative, you did what you felt you had to do, and you did everything that you could to try to protect yourself.
[00:49:22] That is a very powerful message, and a powerful feeling as well, because if you think about it, you're not trying to keep things secret, right?
[00:49:36] You're just trying to make sure that what you choose to share, and quite honestly, who you share that with is your choice, and it's not somebody else's choice.
[00:49:49] And especially it's not somebody else's choice when they don't even have your permission to do that.
[00:49:55] So let that sink in, and let's be a little bit smarter. Let's protect ourselves. I hope everybody enjoyed the show. We'll be back next week with some great AI content, and we'll talk about some new technologies that are out there. Stay safe, be prepared, and we'll see you all next week.