Nonprofits - This security-focused educational webinar was presented by Erich Kron, a Security Awareness Advocate at KnowBe4!
Social engineering is a practice we use almost every day of our lives. It is apparent in how we interact with our families, our friends, strangers, and even those coworkers we don't really like. It's really just the practice of dealing with other humans.
By studying these interactions, attackers can become very adept at using these skills to manipulate people into actions that benefit them. Phishing, smishing, and vishing are all tools of the attacker's trade. The psychology used in these attacks to bypass critical thinking is becoming more and more advanced. By leveraging techniques like focus redirection, and by exploiting the ways our brain's filters can be tricked into perceiving a different reality, attackers are outpacing our best efforts to defend ourselves. We do know that throwing money at the problem doesn't make it go away: social engineering methods, and the cybercriminals behind the attacks, are furiously innovating.
Fear, anxiety, and outrage are all being used to spread ransomware and other types of malware, scam people and organizations out of money, and disrupt business. It's no wonder that social engineering and phishing are the most common ways successful breaches get started.
This session will look at the techniques social engineers use to trick users into performing the kinds of actions that lead to security breaches, and at ways to identify and counteract these attacks. It will also discuss recent real-world attacks and the social engineering tricks that made them effective.
The Perception vs. Reality Dilemma
Psychology behind the attacks
Identifying and developing defensive practices
02:29:25 Erica Woods: 1. Manageable password strategy? How often should we change? Use random?
2. Password Managers? How do they work? Should we consider using one (any particular one) or just use the old fashioned method of writing our passwords down?
3. Received emails lately (from a different person/name each time, but all structured the same) saying that they know my password - they typed it into the email and that they know even more about me and will expose more details if I don't respond to their email. I've been ignoring the emails, but should I address it somehow? Reporting to some authority, the police for example? More importantly, how do we prevent this from happening again?
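On the "use random?" part of the password question above: one manageable strategy is to let software generate passwords and passphrases for you rather than inventing them. A minimal sketch using Python's standard `secrets` module (the tiny word list here is only an illustration; a real passphrase should draw from a list of thousands of words, as a password manager or Diceware list would):

```python
import secrets
import string


def random_password(length=16):
    # secrets (not the random module) makes cryptographically secure choices.
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))


def random_passphrase(words, count=4):
    # Diceware-style passphrase: independently pick words from a word list.
    return "-".join(secrets.choice(words) for _ in range(count))


sample_words = ["correct", "horse", "battery", "staple", "orbit", "maple"]
print(random_password())                 # a different 16-character password each run
print(random_passphrase(sample_words))   # e.g. four hyphen-joined words
```

This is the same idea a password manager automates: unique, random credentials per site, so one leaked password can't be reused against your other accounts.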
02:43:14 Jay G: moderate
02:43:16 Fox Reymann: I've seen the Kevin Mitnick movie :) that's all I know
02:43:42 Austin Moon: I’m familiar with the standard cognitive biases that make people vulnerable — i.e. the reciprocity principle, etc. And, spear-phishing, etc. Excited to learn more!
03:23:38 Erica Woods: Have questions? Enter into the Chat box!
03:24:32 Nurdan Yaylaz Moon: I have to go but thank you so much! Great talk!
03:24:43 Austin Moon: Do you have any thoughts on bolt-on services to banking? I.e. Quickbooks, Mint.com, and others which require linking bank access?
03:25:17 Erica Woods: REMINDER - You can access 6 other training recordings we've done via https://tech4goodtampa.org/resources/ (topics = SEO, Google Ad Grants, WordPress, Give (donation plug-in for WordPress websites), social media, intro to TechSoup, and How to Choose a Website Host)
03:25:59 Austin Moon: Do you have any advice on using and/or choosing a password manager? Are browser-based password managers a good idea to use?
03:33:27 Austin Moon: I always worry about my elderly family members accidentally downloading keylogger malware. Is that a reasonable thing to worry about?
03:36:47 Innocent Kahigi: Thanks Guys... was awesome learning. Stay safe during this COVID-19
03:39:11 Austin Moon: any thoughts on end-point protection for mobile phones?
03:39:15 Austin Moon: thank you for answering all of these!
03:40:55 Revocatus: I am from Tanzania. We have a lot of social engineering based on mobile money transactions (e.g. M-Pesa, Tigo Pesa and others); it happens a lot to mobile money users in my country. Can you talk more about mitigation?
02:25:25.470 --> 02:25:26.280
Eli van der Giessen: How are you doing?
02:25:26.940 --> 02:25:29.850
Erica Woods: I'm doing pretty well. Oh, there we go.
02:25:30.390 --> 02:25:37.350
Eli van der Giessen: There we go. So, you know how things run from the last one here. Let me make you the co-host. Done.
02:25:38.010 --> 02:25:39.540
Eli van der Giessen: Nice. Boom.
02:25:41.460 --> 02:25:49.560
Eli van der Giessen: Perfect. So, uh, so you're now in, which means you can, you know, from the participant view, if you click that, basically, you know,
02:25:49.980 --> 02:26:00.390
Eli van der Giessen: make sure that you can mute anyone or remove them, if it ever becomes necessary. I don't think so, but, uh, certainly that new power is always good if someone comes in with too much background noise.
02:26:01.080 --> 02:26:04.920
Erica Woods: Yeah, for all the webinars. I do. I automatically mute folks.
02:26:05.070 --> 02:26:08.700
Eli van der Giessen: Yeah, these people come in muted by default but
02:26:10.680 --> 02:26:13.740
Eli van der Giessen: But they do have the power to unmute themselves, the way my settings are set up.
02:26:14.610 --> 02:26:18.390
Erica Woods: I like it. I just started using Zoom a couple weeks ago. I'm a huge fan.
02:26:19.230 --> 02:26:27.810
Eli van der Giessen: Yeah, look, when I first discovered Zoom a couple years ago, I'm like, oh, this one, mostly it's the same as every other one, except it
02:26:28.380 --> 02:26:38.670
Eli van der Giessen: works a bit more reliably, just in that, like, participants can get in a bit more easily. Now obviously that led to all of the, like, lax security settings complaints later on. That's the
02:26:38.730 --> 02:26:43.920
Eli van der Giessen: other side of that coin, but those lax security settings meant it was really easy to get people in.
02:26:44.910 --> 02:27:03.060
Erica Woods: Yeah, it's funny, you're in our Baltimore Tech for Good Facebook group, but, um, one of the nonprofits was, like, struggling initially with Zoom. Like, how do I set up the security controls while keeping things open, but making sure that we don't get Zoom-bombed?
02:27:06.330 --> 02:27:07.530
Eli van der Giessen: Oh yeah, they actually just
02:27:08.760 --> 02:27:13.020
Eli van der Giessen: changed the default setting, I think last week, to have everyone go
02:27:14.160 --> 02:27:17.310
Eli van der Giessen: exactly, to the waiting room. They have that waiting room feature.
02:27:22.560 --> 02:27:24.240
Erica Woods: And I liked that you set it up to
02:27:25.260 --> 02:27:31.530
Erica Woods: require somebody to register, too, because I haven't done that with webinars yet.
02:27:32.070 --> 02:27:35.130
Eli van der Giessen: Yeah, so, you mean the pre-registration.
02:27:35.670 --> 02:27:44.550
Eli van der Giessen: Mm hmm. Right. That way, you have the email of all the people who didn't actually make it, you know, so you can still send them the information afterwards. So that's the only advantage to that particular approach.
02:27:44.850 --> 02:27:49.890
Erica Woods: Okay, cool. Yeah, I might start setting that stuff up for my webinars too, the ones I do for Apex.
02:27:50.580 --> 02:27:58.470
Eli van der Giessen: Right, right. I see, the next one, yeah, you're using the Apex platform. So some are here, some are there, which is, yeah, which works out fine.
02:27:59.400 --> 02:28:05.400
Erica Woods: So I have a question. When I bounce, because right now I'm not looking at the Zoom, I'm looking at the... can you still see me?
02:28:07.170 --> 02:28:09.510
Eli van der Giessen: Yes. Yeah, I'm still seeing your camera.
02:28:10.200 --> 02:28:17.010
Erica Woods: Okay, cool. I didn't know you were recording already. Do you want to pause the recording until we get started, or...
02:28:17.070 --> 02:28:24.390
Eli van der Giessen: Well, I'll forget, is the issue, to restart it. So I always just keep it going, and then just at the end I'll chop like the first 20 minutes off or something.
02:28:27.990 --> 02:28:35.610
Eli van der Giessen: Oh, I see Erich here. Let me make you co-host on this as well, so that way you can also, you know, go through,
02:28:36.270 --> 02:28:40.590
Eli van der Giessen: control any audio kind of issues. And also it's been nice, because with the two of you, then,
02:28:41.220 --> 02:28:50.490
Eli van der Giessen: one of you can always keep an eye on the chat window, as sometimes people are shy and want to just say, like, I don't want to ask a question, but I'll put it in the chat, and then someone can read it out on their behalf.
02:28:50.820 --> 02:28:55.050
Erich Kron: Yeah, so what I'm going to do, I'm going to be sharing my screen, right?
02:28:55.080 --> 02:28:55.830
Erich Kron: That's the idea here.
02:28:56.310 --> 02:28:59.790
Eli van der Giessen: Yep. Totally. And that should work, the button, the link at the bottom.
02:29:00.360 --> 02:29:06.210
Erich Kron: Okay, cool. First step, though, is I just got off the other webinar. I need to do a bio break real quick. So I'll be back.
02:29:06.450 --> 02:29:08.040
Erich Kron: In just a moment. Okay.
02:29:08.040 --> 02:29:09.540
Eli van der Giessen: Take care of yourself. Absolutely.
02:29:09.660 --> 02:29:10.680
Erich Kron: All right, one moment.
02:29:14.250 --> 02:29:18.540
Erica Woods: I'm going to, Eli, with the questions you sent me yesterday, I'm just going to go ahead and
02:29:19.680 --> 02:29:20.670
Erica Woods: enter those.
02:29:22.350 --> 02:29:28.440
Eli van der Giessen: Great idea. Yeah. Someone was super keen and, like, said, actually, here's questions I definitely want answered, right?
02:29:29.310 --> 02:29:31.080
Eli van der Giessen: I thought that was pretty great, actually.
02:29:31.740 --> 02:29:39.030
Erica Woods: I don't know if Erich even knows that you guys, um, that TechSoup uses KnowBe4. I think that's so cool.
02:29:39.960 --> 02:29:49.860
Eli van der Giessen: Yeah, I was really excited myself. I'm like, oh, I know that platform, because, you know, for basically every time we bring in, like, some kind of new feature, it's like, and here's the security things we need to know about it.
02:29:50.490 --> 02:29:58.230
Erica Woods: Yeah. Did you know that from my building, actually, where I live, physically, I look at the KnowBe4 headquarters?
02:29:59.490 --> 02:30:02.160
Eli van der Giessen: Oh, so it's like it's in your backyard. Fun.
02:30:02.760 --> 02:30:14.550
Erica Woods: Yeah, one of my, actually one of my best friends, Katrina, hooked us up. She works on the same floor as Erich, and she's been in the Tech for Good Facebook group for,
02:30:14.790 --> 02:30:16.950
Erica Woods: I don't know, a year plus and we
02:30:17.010 --> 02:30:31.980
Erica Woods: kept talking about, like, oh, we need to get, you know, them involved, because KnowBe4, especially in Tampa, you know, they're so connected and they give back so much. And finally, Katrina and Erich, he set up a meeting like two months ago.
02:30:34.200 --> 02:30:34.650
Erica Woods: Was like
02:30:34.770 --> 02:30:37.890
Eli van der Giessen: Exactly making use of the power of our local networks.
02:30:39.090 --> 02:30:40.620
Erica Woods: I should have done it, like, a year ago.
02:30:45.030 --> 02:30:47.340
Eli van der Giessen: As they say, no time like the present.
02:30:48.090 --> 02:30:52.740
Erica Woods: Yeah. So how do you want this to work, do you want to introduce Erich, or do you...
02:30:52.980 --> 02:30:55.950
Eli van der Giessen: No, I think this is going to be totally your show. In fact,
02:30:56.370 --> 02:31:08.670
Eli van der Giessen: I've got a poached egg, which is coming up. I think in like two minutes off the oven. See, here's my scones. My biscuits and a burning myself so I'm not going to show them off.
02:31:09.870 --> 02:31:23.910
Eli van der Giessen: But, but the back there. So yeah, actually, I think I'll need to disappear. One thing I might do before we dive in, if, if you're comfortable with that is, I can also set this up to live stream directly to the Facebook page of Net Squared.
02:31:25.650 --> 02:31:44.820
Eli van der Giessen: So yeah, if that's OK, once I start that up, I'll get that going. And then really, I think I'll probably just log out, let you run the show, and I'll send you the link to the video and the transcript and all the other pieces, and the chat transcript, probably later today.
02:31:45.900 --> 02:31:48.330
Erica Woods: No, that's perfect. Yeah, I'm just gonna kick it off.
02:31:49.650 --> 02:32:01.470
Erica Woods: explain how we got connected, talk about how awesome KnowBe4 is for a hot second, and then the show is his. But hey, just out of curiosity, how many people are on your NetSquared Facebook?
02:32:05.340 --> 02:32:14.910
Eli van der Giessen: I don't know offhand, but it's something I haven't used too much until I started, like, streaming video, and it's always sort of been Twitter that has been the more engaged platform for us.
02:32:16.500 --> 02:32:18.120
Eli van der Giessen: If I actually look at it, though.
02:32:19.950 --> 02:32:23.250
Eli van der Giessen: He did pretty good to tell this kind of stuff.
02:32:26.760 --> 02:32:27.810
Erica Woods: Co organizer.
02:32:28.020 --> 02:32:31.320
Eli van der Giessen: Here in Tampa. Oh, nice. I'm Eli.
02:32:32.970 --> 02:32:34.620
Erica Woods: That's Erich, you haven't met him yet.
02:32:35.220 --> 02:32:37.050
Erica Woods: Steve and Trina.
02:32:41.760 --> 02:32:42.240
Erich Kron: What was that
02:32:44.160 --> 02:32:44.760
Erica Woods: I like it.
02:32:46.440 --> 02:32:57.330
Eli van der Giessen: So, so, yeah. There's about 3,000 on the Facebook page. But you know, I think we'll probably get like 100 interactions or something on that Facebook thing. Who knows how they're actually measuring.
02:32:58.110 --> 02:33:00.480
Erica Woods: Yeah, that's cool. Erich, are you cool with this?
02:33:02.610 --> 02:33:05.580
Erich Kron: I'm sorry, it's kind of, your volume is going down a little there.
02:33:06.000 --> 02:33:08.910
Erica Woods: How about now? The thing is, I have tech support here. Am I better?
02:33:09.180 --> 02:33:15.420
Erich Kron: Yeah, yeah, it starts and then it kind of goes down a little bit. It's, yeah. It's like, yeah. Anyways, what was that
02:33:16.050 --> 02:33:20.610
Erica Woods: Um, are you comfortable with this being streamed through Facebook for the...
02:33:21.150 --> 02:33:25.740
Erich Kron: Tech for Good page? Great. I am comfortable with it going wherever you want it to go. Okay.
02:33:25.770 --> 02:33:26.880
Erich Kron: Okay, cool.
02:33:29.460 --> 02:33:32.130
Erich Kron: Yeah, no worries for me at all whatsoever.
02:33:33.690 --> 02:33:40.350
Erich Kron: So flow-wise, you're going to start off, yeah, and then you're going to throw to me, and I'll share my screen, actually.
02:33:40.830 --> 02:33:41.580
Erica Woods: Yeah, if you want to.
02:33:42.360 --> 02:33:43.710
Erich Kron: Yeah, let's do that.
02:33:45.180 --> 02:33:50.490
Erich Kron: Doo doo doo doo Baba, Baba. Let's see. Let's do it.
02:33:52.560 --> 02:33:55.920
Erich Kron: Here, like so. Do you have the whole screen up there?
02:33:56.550 --> 02:34:02.610
Erich Kron: Sure do. Awesome. Awesome. I'm going to do a slide change real quick. Did that change.
02:34:05.700 --> 02:34:06.450
Erich Kron: Did that change for you.
02:34:06.900 --> 02:34:13.170
Erich Kron: Yes. Okay. Cool. That's all working. Happy days. Good goodness. Okay, so,
02:34:14.280 --> 02:34:16.530
Erich Kron: you're going to start off, you're gonna do the intro, right?
02:34:16.950 --> 02:34:21.180
Erich Kron: Then you're going to throw it to me. Then I'm going to do whatever. And then in the end, I'm going to come back to you and say,
02:34:21.540 --> 02:34:25.050
Erich Kron: Hey, Erica, do we have any questions from our studio audience?
02:34:25.110 --> 02:34:27.000
Erica Woods: Or something. Are you gonna say it just like that.
02:34:28.680 --> 02:34:29.550
Erica Woods: Unless you do
02:34:30.330 --> 02:34:31.680
Erich Kron: I really can
02:34:32.190 --> 02:34:42.390
Erich Kron: I have no shame, I'm just going to warn you right now. It just depends on your audience, and whether they're going to laugh at it, or corporate board members are going to go, these people are idiots. You know, it's
02:34:42.660 --> 02:34:45.780
Erica Woods: No, have some fun with it, especially in the current climate.
02:34:46.590 --> 02:34:50.850
Erich Kron: You know what, and, yeah, everybody's like a little stressed.
02:34:51.360 --> 02:34:55.470
Erich Kron: I'm really trying to keep the energy up as much as possible.
02:34:56.490 --> 02:35:06.450
Erich Kron: Just because of that. I mean, people are at home, and they're like, oh, great, look, it's, what's it, day? I don't even know what day of the week it is, you know. And so, trying to keep it going.
02:35:07.380 --> 02:35:12.930
Erica Woods: I like it a lot. Okay, cool. And hey, Erich, did you know that TechSoup actually uses KnowBe4?
02:35:13.890 --> 02:35:21.270
Erich Kron: You know, I think I did hear that rumor. Yeah. Yeah. It's not surprising. I mean, honestly, we're at like 32,000 customers or something now.
02:35:21.870 --> 02:35:22.950
Erich Kron: Oh, yeah.
02:35:23.460 --> 02:35:35.910
Erica Woods: That's crazy. I didn't realize that until I got this set up with Eli. And just to say, you know, Eli runs all of the Tech for Good groups across the world. Come on, you're at, Eli, 132, something like that?
02:35:40.290 --> 02:35:40.830
Erich Kron: Nice.
02:35:50.100 --> 02:35:52.170
Erica Woods: Your, your volume went down considerably.
02:35:52.440 --> 02:35:53.610
Eli van der Giessen: Because my little
02:35:55.710 --> 02:35:57.900
Eli van der Giessen: Let me adjust my mic to the right place.
02:35:59.280 --> 02:36:10.680
Eli van der Giessen: So, yeah, yeah. There's like 120 groups scattered across 41 countries, and they've all got an Erica in them, championing the connections between, like, the tech industry and local nonprofits.
02:36:11.490 --> 02:36:12.150
Erich Kron: That's awesome.
02:36:13.770 --> 02:36:26.790
Eli van der Giessen: Even more awesome is, my coffee just came up, so stoked. So I am going to disappear now. The Facebook Live stream did not work, as it sometimes doesn't, about one quarter of the time, for reasons I don't understand.
02:36:27.780 --> 02:36:35.940
Eli van der Giessen: But I'll definitely have the video recording available, and I'll share that YouTube link with the two of you later today, once I have that downloaded.
02:36:36.870 --> 02:36:40.500
Eli van der Giessen: Cool. Lovely. Erich, yeah, Erica, yeah, go ahead.
02:36:41.100 --> 02:36:50.520
Erich Kron: Oh, yeah. So yeah, at the end of this, are you going to do the closing statement then, Erica? So, I get done, I hit my thanks-everyone slide, and we go to Q&A, and then you do the closing, right?
02:36:50.880 --> 02:37:03.360
Erica Woods: Yeah, that sounds great. And then just as a heads up, Erich, I already entered three questions that somebody submitted, in the chat box, mostly so I don't forget them. But we had a couple that were pre-entered, which is good.
02:37:03.690 --> 02:37:04.830
Erich Kron: Awesome. Perfect.
02:37:06.570 --> 02:37:10.980
Erich Kron: Groovy, groovy, groovy. Password managers, yes, I love those questions.
02:37:13.350 --> 02:37:14.040
Erich Kron: Okay, cool.
02:37:15.180 --> 02:37:17.760
Erica Woods: How often do you get stumped by a question, Erich?
02:37:18.660 --> 02:37:28.530
Erich Kron: Honestly, and I mean, I almost feel like it's arrogance saying this, but not very often. And I say that just because I've got a real broad
02:37:29.160 --> 02:37:42.210
Erich Kron: background in this stuff. The only things that actually stump me, really, are when they get into, like, platform-specific stuff, like, on Tuesdays, if you do this and you click that button, does it send to 42% of the people? I've
02:37:43.320 --> 02:37:54.060
Erich Kron: got no idea, right? And pricing, I avoid knowing our pricing, other than real rough numbers, like the plague, no pun intended.
02:37:55.080 --> 02:37:58.590
Erich Kron: But I really should stop saying that, with the COVID. I wonder if that'll change, right?
02:38:00.810 --> 02:38:09.450
Erich Kron: But I avoid that, because that's just not my place to be. So those are the things that stump me. Technically, though, I pretty much have an opinion on everything.
02:38:11.130 --> 02:38:17.490
Erica Woods: Well, I'm sure if you did get stumped, you've got an array of resources at KnowBe4 that can help you out.
02:38:17.850 --> 02:38:25.860
Erich Kron: Yeah. And the thing is, I'm not afraid to say that's not my area of expertise or, you know, I really don't know. But this is kind of how I see it.
02:38:27.810 --> 02:38:30.180
Erich Kron: But I mean, I've been doing this for so long.
02:38:31.380 --> 02:38:34.800
Erich Kron: Very little surprises me. At this point, I mean,
02:38:35.370 --> 02:38:37.950
Erich Kron: He's kidding. Yeah, it's that again, you know,
02:38:38.580 --> 02:38:43.200
Erica Woods: How long have you been doing, kind of, a role in cybersecurity and cybersecurity education?
02:38:44.070 --> 02:38:49.380
Erich Kron: Cybersecurity, I've been in cybersecurity since '95, I think.
02:38:50.460 --> 02:39:05.700
Erich Kron: And I've been in medical, I've been in manufacturing, I've been in the certification realm. Honestly, I worked for (ISC)², with the CISSP, the kind of, like, top cert stuff.
02:39:07.350 --> 02:39:13.530
Erich Kron: And then, you know, I've been in all kinds of different areas. I actually used to fix biomedical equipment, too.
02:39:14.790 --> 02:39:20.460
Erich Kron: I'd actually fly around to different hospitals and fix the machines that do cancer diagnostic testing.
02:39:21.000 --> 02:39:24.570
Erich Kron: It's called in situ hybridization and immunohistochemistry.
02:39:27.720 --> 02:39:38.790
Erich Kron: Yeah, I've got, I have so much of a background, just in many different things. Also, I'm one of these people, like, I do 3D printing, I have a laser engraver in my garage, I do CNC milling, I do
02:39:39.450 --> 02:39:46.200
Erich Kron: cars. I mean, I used to do rock crawling in Arizona. I mean, you name it, I've dabbled in it. Drives my wife crazy.
02:39:47.310 --> 02:39:53.220
Erich Kron: But I've just done a lot of different things over the course of my life.
02:39:54.180 --> 02:40:00.270
Erica Woods: I think Steve is nerding out over here. He got real jazzed yesterday about his new storage array that came in the mail.
02:40:00.570 --> 02:40:01.890
Erich Kron: Oh, nice, nice.
02:40:02.430 --> 02:40:08.640
Erich Kron: Yeah, I had a crazy home lab here for a while, too. I helped a group tear down a,
02:40:10.020 --> 02:40:28.980
Erich Kron: a, what do you call it, a data center thing. And so they gave me a machine. Now, this machine was 40 physical cores. So it was a Dell R810, 40 physical cores with hyperthreading, and a terabyte of RAM.
02:40:30.030 --> 02:40:47.070
Erich Kron: It had so much RAM in it, I had to actually only plug one of the power supplies into my UPS, because when it would boot, to charge all the RAM, it was kicking my UPS offline. And I had this running in my garage for about a month and a half, until the power bill came in.
02:40:47.310 --> 02:40:48.060
Erich Kron: And the wife's like,
02:40:49.230 --> 02:40:49.770
Erich Kron: Wait a minute.
02:40:52.380 --> 02:40:58.260
Erich Kron: And what am I gonna do with a machine like that, right? It made a wicked cool Minecraft server, but I
02:40:59.310 --> 02:41:08.880
Erich Kron: had to let it go. But that was part of the thing. Yeah, I had a whole, like, rack in my garage and all that. Now I'm down to just machines scattered all about. I do microcontroller stuff.
02:41:09.720 --> 02:41:19.320
Erich Kron: So Arduino-type stuff. I have an oscilloscope sitting next to me here. I mean, I'm a total nerd. Like I said, I've stuck my hands in just about everything.
02:41:22.050 --> 02:41:23.310
Erich Kron: All right, we're about ready to start right
02:41:23.880 --> 02:41:25.590
Erica Woods: Yeah, so I want to get
02:41:26.640 --> 02:41:34.020
Erica Woods: While we give it a second for folks to join us. Um, I want to do a pulse check for everyone who's tuned in with us.
02:41:34.350 --> 02:41:50.670
Erica Woods: Use the chat box and just let us know: where do you stand in terms of just your awareness with the concept of social engineering? Which, essentially, how would you just quickly summarize what social engineering is, for folks like myself, Erich, who really don't even understand it?
02:41:52.230 --> 02:42:01.860
Erich Kron: Yeah, so social engineering, social engineering is the practice of manipulating human emotions, basically, to get something somebody wants.
02:42:02.850 --> 02:42:11.730
Erich Kron: It's essentially scams and cons. It is social engineering using social norms. You know, some of the tricks are,
02:42:12.570 --> 02:42:18.180
Erich Kron: A lot of companies have two doors in the front of their buildings, right, especially in cold climates.
02:42:18.570 --> 02:42:28.350
Erich Kron: So the first door is typically not locked, it lets people get in, and then they take a badge and they hit the second one, which has the latch, to open it up. If you open the first door and hold the door for somebody,
02:42:28.890 --> 02:42:34.710
Erich Kron: They feel like they need to reciprocate and will often hold the second door for you, which is a secure door.
02:42:35.790 --> 02:42:45.540
Erich Kron: So if you have a badge that even looks legitimate you can show it to them and they go, okay, and then they let you in. So tricks like that using human behavior against us.
02:42:46.350 --> 02:42:55.830
Erica Woods: Wow, that was an awesome explanation. So, quick pulse check in the chat box, everyone who's tuned in with us today for our Tech for Good webinar through TechSoup.
02:42:56.340 --> 02:43:11.580
Erica Woods: Um, what's your knowledge level with this concept is it completely new to you. Do you know a little bit you know if there's anything you're hoping to learn today. Throw that in there. But let's just hear from you guys kind of where you stand with your knowledge base around this concept.
02:43:17.730 --> 02:43:20.370
Erica Woods: I've seen the Kevin Mitnick movie. What is that?
02:43:21.690 --> 02:43:30.480
Erich Kron: I didn't know he had a movie. He's done a number of books. Maybe you're thinking of Frank Abagnale; that was Catch Me If You Can.
02:43:33.210 --> 02:43:35.910
Erich Kron: What's his name, Leonardo DiCaprio, right? The
02:43:36.540 --> 02:43:48.630
Erich Kron: yeah, from Titanic, of course. I've met him a couple of times, actually, and fascinating, fascinating guy. And that's what he used, pretending to be an airline pilot, right, just showing up in an outfit and, you know,
02:43:49.230 --> 02:43:56.580
Erich Kron: one of the things we talk about in the social engineering side of the world is, with a clipboard and some confidence, you can get just about anywhere you need to go.
02:44:04.320 --> 02:44:06.390
Erica Woods: Well, thank you for that. Um,
02:44:06.570 --> 02:44:20.310
Erica Woods: And also for sharing that link. I'm really excited to introduce Erich, but a little bit about me first, and just kind of this program, if you aren't familiar with the NetSquared Tech for Good groups. My name is Erica Woods. I know a lot of you.
02:44:20.790 --> 02:44:30.150
Erica Woods: I lived in Baltimore for a very, very long time and had the privilege of being one of four folks who started the Baltimore NetSquared Tech for Good group,
02:44:31.200 --> 02:44:41.460
Erica Woods: circa November 2014. And what I think is still amusing is that we started that group because we saw a real need in Baltimore
02:44:41.790 --> 02:44:49.980
Erica Woods: to bring together technology folks like Erich, who really want to help nonprofits, and give them a platform to do so,
02:44:50.370 --> 02:44:58.590
Erica Woods: with nonprofits like yourselves, who are just in dire need of understanding these core concepts, like social engineering,
02:44:59.010 --> 02:45:16.230
Erica Woods: so you can keep yourself safe, and also anyone that you're helping within your community. So when we started that group, I had no idea it was actually a global effort by TechSoup, right? And then a year later, the Baltimore Tech for Good group became associated with them.
02:45:17.400 --> 02:45:28.170
Erica Woods: And then a couple years later, I very impulsively moved to the Tampa, Florida area. As Erich can relate as a neighbor, it is quite gorgeous here, and so I don't want to go home.
02:45:28.590 --> 02:45:36.720
Erica Woods: And so about a year after moving here, I started, with Steve and Naomi, the Tech for Good group here in Tampa Bay.
02:45:37.320 --> 02:45:45.720
Erica Woods: And our goal, of our group and all the other Tech for Good groups across the world, is to really, again, bring together experts like Erich
02:45:46.050 --> 02:45:59.730
Erica Woods: to explain and educate on these concepts: social engineering, websites, WordPress, data platforms, anything in that technology or social media marketing space.
02:46:00.090 --> 02:46:11.820
Erica Woods: We want to make sure that you guys are educated, no cost, no buts, and then that way you can use that education and continue to do the amazing work through your nonprofit initiatives.
02:46:12.270 --> 02:46:22.410
Erica Woods: So on that note, Erich Kron. I had the pleasure of meeting Erich a couple months ago. He's a Security Awareness Advocate for KnowBe4, which is one of the leading
02:46:22.740 --> 02:46:31.290
Erica Woods: organizations that provide cyber education to, what did you say your customer base is, Erich, over 30,000 customers across the world?
02:46:31.530 --> 02:46:33.810
Erich Kron: Yeah, about 32,000 customers now.
02:46:34.230 --> 02:46:47.820
Erica Woods: Whew. And Erich, as you guys heard from the introduction, is very knowledgeable on this concept, and that's his job. He's one of three advocates and evangelists at KnowBe4, and his goal in life is to educate folks on
02:46:48.330 --> 02:46:55.890
Erica Woods: these concepts, help organizations stay safe, and educate their employees. So, Erich, so happy to have you here today. Thank you.
02:46:56.340 --> 02:47:04.860
Erich Kron: Yeah, thrilled to be here. You know, we love offering whatever help we can, any way we can, to folks, especially in the nonprofit, not-for-profit community.
02:47:06.060 --> 02:47:08.940
Erich Kron: We know that you know the attacks are coming. Money's limited
02:47:10.440 --> 02:47:24.990
Erich Kron: And this is something that's been going on since the dawn of time, quite frankly. It's something that we've been struggling with for years, and, you know, like I said, I'm always happy to share the experience that I have with anyone that I can, to help them.
02:47:26.400 --> 02:47:30.090
Erich Kron: So, having said that, let's hop right into the slides then. What do you say, Erica?
02:47:31.470 --> 02:47:40.020
Erich Kron: Alright, cool beans. Alright, so, Social Engineering 101. We're not going to get super deep into this, because I can do this talk for hours on end and get into
02:47:40.410 --> 02:47:48.450
Erich Kron: All kinds of things behind it, but I want to make sure that people understand a little bit more. We've seen you know through the chat that there's some people that have moderate knowledge about this.
02:47:49.020 --> 02:47:54.840
Erich Kron: That are familiar with it. So I'm going to get into some of the principles behind it and how it works. Please, please, please.
02:47:55.380 --> 02:48:02.850
Erich Kron: If you have questions. Let's put them in the chat and I will be happy to get to as many of them as I can always happy to do that.
02:48:03.390 --> 02:48:08.400
Erich Kron: It is a fascinating subject for people like me. I mean I travel a lot for work.
02:48:09.030 --> 02:48:18.120
Erich Kron: This is my job, to cruise around the country and talk. Well, one of the things I also like to do is sit back and watch human interactions. And of course,
02:48:18.420 --> 02:48:30.870
Erich Kron: airports are an amazing place for that sort of thing. You see people in all kinds of different situations from joy and elation to seeing people they haven't seen for a while to the ever so
02:48:31.830 --> 02:48:43.080
Erich Kron: Stressful situations when things go wrong when flying right and so I try to be a student of people and try to see how they react to things. It's a fascinating subject for me.
02:48:43.410 --> 02:48:50.640
Erich Kron: And I like to share that with other people too. Because I know a lot of people also go wow that's it's kind of neat to peel that back so
02:48:50.910 --> 02:48:54.960
Erich Kron: We're going to talk about how attackers, especially in the phishing and what's called smishing,
02:48:55.350 --> 02:49:05.190
Erich Kron: Which is SMS phishing or text message phishing, spaces work, and how they manipulate people using social engineering techniques. That's what we're going to talk about.
02:49:05.790 --> 02:49:15.150
Erich Kron: So a little bit about me again, I've been in the industry since the mid 1990s, I have been in all kinds of different places I've been in
02:49:15.750 --> 02:49:21.630
Erich Kron: Manufacturing. I've been in medical I was with the US Army for about 10 years as a contractor.
02:49:22.020 --> 02:49:30.930
Erich Kron: I ended up being the security manager at the 2nd Regional Cyber Center, Western Hemisphere, which is a mouthful when you have to answer the phone like that every time.
02:49:31.650 --> 02:49:37.920
Erich Kron: But I've been in, like I said, just about every kind of category of place to be over the years now.
02:49:38.760 --> 02:49:47.730
Erich Kron: I've gained a lot of experience through that and I've seen a lot of things. And one thing that became very, you know, very up front to me and very
02:49:48.600 --> 02:49:58.230
Erich Kron: real to me kind of early on in my career is how much people were a problem. Now I use that term lightly because it's not their fault. Right.
02:49:58.950 --> 02:50:12.390
Erich Kron: But from a technical standpoint, people are where a lot of things go wrong. Phishing attacks, social engineering attacks like that, are where 91% of successful data breaches happen.
02:50:13.080 --> 02:50:19.680
Erich Kron: It's a terrifying number, but it's true. And what we have to do is we have to make sure that people
02:50:20.790 --> 02:50:28.980
Erich Kron: Understand how to defend themselves against this sort of thing, right. So that's why I'm so passionate about this and trying to help people do that.
02:50:29.370 --> 02:50:40.350
Erich Kron: Now our organization again down here in Tampa as Erica said once you come here, it's hard to go home. The weather is good unless you show up in late July or August when it's about 200% humidity.
02:50:41.220 --> 02:50:48.420
Erich Kron: But you know we got about 31,000 customers like we said maybe 32,000 customers. Now we do security awareness training.
02:50:48.900 --> 02:50:53.880
Erich Kron: And simulated phishing. It's a platform we provide to people online, on demand,
02:50:54.210 --> 02:51:04.380
Erich Kron: Where they can train their people and then turn around and do simulated phishing attacks on them as well. And the reason for doing that is to give them a chance to practice.
02:51:04.680 --> 02:51:16.740
Erich Kron: With stuff that if they mess up. It doesn't hurt anything as opposed to practicing what they learned on real phishing attacks right kind of makes sense when you think about it that way. And it's very, very effective how this works.
02:51:17.670 --> 02:51:27.750
Erich Kron: So it's not a cybersecurity conference type, you know, presentation if we don't start off with a quote by Sun Tzu. You may have read this book, kind of a recent one.
02:51:28.980 --> 02:51:40.200
Erich Kron: All warfare is based on deception is the quote here, and I like this, although it is a little bit scary, and I'm not a big proponent of what's called FUD, which is fear, uncertainty and doubt. But let me tell you,
02:51:41.010 --> 02:51:47.640
Erich Kron: When we talk about warfare in this case that is actually what we're dealing with. And I say that because
02:51:48.330 --> 02:51:57.900
Erich Kron: What we have is we have nation states involved with this. We've all heard that China, North Korea and Iran and all these have these incredible hacking programs.
02:51:58.290 --> 02:52:07.350
Erich Kron: Absolutely, they do. It's nation states, and in a lot of cases it's also organized crime. You know, the old... people used to go around with bats and break knees for not making payments or whatever.
02:52:07.650 --> 02:52:15.990
Erich Kron: This organized crime stuff has now gotten into this because it's so beneficial and so lucrative. They are making billions of dollars a year through this.
02:52:16.320 --> 02:52:24.240
Erich Kron: So these groups are getting in there. What's important to understand is who is running most of the attacks we see out there these days.
02:52:25.020 --> 02:52:33.480
Erich Kron: They're going to be cyber crime gangs nation states or some sort of an organized crime. It is not the kid in the basement with a hoodie on
02:52:34.260 --> 02:52:43.890
Erich Kron: eating pizza drinking Mountain Dew. Okay, that's not what we have going on here. This is actual organized businesses and they treat them like businesses.
02:52:44.730 --> 02:52:54.660
Erich Kron: So the key is the deception part here. And what that means is when it comes to deception, what they're doing is they're messing with how we see things
02:52:55.380 --> 02:53:04.170
Erich Kron: They're deceiving us into clicking on links opening documents, doing things like that. And that's true all warfare is based on
02:53:04.920 --> 02:53:11.940
Erich Kron: Deception. Now we're going to start off, we're gonna have a little fun here. This is kind of what we're talking about today: perception versus reality, the OODA loop,
02:53:12.630 --> 02:53:20.970
Erich Kron: How these different components are being subverted and how we can defend ourselves. So let's start with a perception versus reality we're going to have a little bit of fun.
02:53:21.420 --> 02:53:29.940
Erich Kron: Don't know if you've ever seen this before, but I am going to channel the magic of PowerPoint into a card trick here. So let's start with this.
02:53:30.240 --> 02:53:44.670
Erich Kron: I want you to pick a card. Pick any card, remember what that card is I'm going to make that card disappear. Are you ready, here we go through the magic of this PowerPoint transformation here and that's where all the power is it's in PowerPoint. I've removed your card. Have I
02:53:45.720 --> 02:53:54.780
Erich Kron: So everybody here should be nodding their head. Yes, yes, yes. Well, how did we do that, how did we remove your card. Well, it's actually fairly simple.
02:53:55.410 --> 02:54:10.860
Erich Kron: First thing we do is we get you to focus on removing a card. So you pick a card and you stick your eyes on that card right this is going to be the deck that you originally chose from. And then what we do is we replace it with this one. What do you notice about them now.
02:54:12.510 --> 02:54:20.340
Erich Kron: Right, they are completely different. But there are patterns in play here. Red kings, black kings on the left,
02:54:21.120 --> 02:54:28.560
Erich Kron: Black queens, black jacks on the right, in that pattern. Now, if I replaced that second deck there with a bunch of, you know, numbered cards,
02:54:29.310 --> 02:54:37.560
Erich Kron: One through whatever you know or ace through whatever you would have noticed immediately right but the patterns are there to kind of fool you.
02:54:37.890 --> 02:54:49.950
Erich Kron: Where you don't see that. And that's the whole idea here. This is just misdirection. By getting you to focus on one card, we're misdirecting from the fact that we're basically yanking the tablecloth out from underneath
02:54:50.550 --> 02:54:57.450
Erich Kron: All of the stuff. So this is just an example of one of the ways that our brains work and how easy it is to kind of
02:54:57.990 --> 02:55:02.340
Erich Kron: Deceive that. Now the truth is, once you know how this trick is performed,
02:55:02.850 --> 02:55:12.780
Erich Kron: You're gonna nod your head and go, yeah, I know how that's done, right? But the first time it happens, the first time you see this, it's kind of like, how did that happen? Right? And that's the usual response.
02:55:13.770 --> 02:55:19.980
Erich Kron: But that's how our brain works. Once we're familiar with the way that the attack works. Boom. We know how to defend ourselves against it. Right.
02:55:21.180 --> 02:55:29.160
Erich Kron: So understanding the root of deception that comes down to our brains. Our brains are wonderful lovely fantastic things but
02:55:30.090 --> 02:55:36.240
Erich Kron: They do things that we don't necessarily realize that they do and that is filter interpret and present reality.
02:55:37.020 --> 02:55:47.100
Erich Kron: We always think what we see what we hear what we taste what we smell all of these things are reality. But what if I told you, your brain applies filters to these
02:55:47.520 --> 02:56:02.100
Erich Kron: Your brain has applied these filters basically since our childhood, when we started learning. We see that if I do this, this occurs. Now our brain is already wired to understand that this will mean this. Okay.
02:56:03.540 --> 02:56:13.770
Erich Kron: That happens very often. There are other things that happen with that that are pretty eye-opening, if you will. Who remembers what color the dress was? Right?
02:56:14.730 --> 02:56:24.420
Erich Kron: Blue and black, or gold and white? Okay, two people, same place, looking at the same monitor, see two different things. That's your brain filtering one thing
02:56:24.690 --> 02:56:37.050
Erich Kron: And doing another. There are some physical characteristics as well, but even if you know both of them are in there, you can only see one or the other without a lot of effort. Same with that sound, Yanny and Laurel. We all saw that, or heard that.
02:56:37.950 --> 02:56:44.700
Erich Kron: It's the same thing. Even though you know that both sounds are in there, once your brain locks onto one, it filters out all the other stuff.
02:56:45.780 --> 02:56:48.510
Erich Kron: Another example I'm going to show my age here, folks.
02:56:49.770 --> 02:56:58.560
Erich Kron: So bear with me if you remember back in the day they used to have these pictures and you would stare at them. They were like 3D shapes or whatever and you stare at them.
02:56:58.920 --> 02:57:16.470
Erich Kron: And you cross your eyes for a little bit and eventually like a whale pops out of these strange, you know, square shapes whatever patterns, all of a sudden this whale pops out of there right now once you've seen that whale. When you look at that picture again you will see the whale again.
02:57:17.670 --> 02:57:24.360
Erich Kron: Your brain has filtered out all of the other stuff. It knows how to spot it. And last but not least is your nose.
02:57:25.470 --> 02:57:34.770
Erich Kron: So if you think about it, when you look even side to side, unless you're looking for it and thinking about it, your brain does not see your nose in the way.
02:57:35.190 --> 02:57:42.420
Erich Kron: Now, I have a pretty good-sized nose; it tends to get very much in the way. But our brain filters that piece out. It's something it does automatically.
02:57:42.810 --> 02:57:51.840
Erich Kron: And this is the fascinating piece to our brains: it applies filters, interprets and presents reality.
02:57:52.770 --> 02:58:06.030
Erich Kron: And that's what the bad guys really go after; they try to change how that happens. It's remarkable. So we'll talk a little bit about the OODA loop. If you haven't heard about it, the OODA loop is something that was,
02:58:07.050 --> 02:58:14.640
Erich Kron: It's, it's a very simple thing, but it's actually how our brains process a lot of things when it comes to decision making. So
02:58:15.420 --> 02:58:23.280
Erich Kron: When we talk about this, magicians, pickpockets, con artists, all of these use the principles we're about to discuss. Now, the OODA loop started off in
02:58:23.970 --> 02:58:34.020
Erich Kron: Aviation, and the idea behind this was, this is the decision process that was broken down by a guy named John Boyd. He's the one that created the OODA loop.
02:58:34.500 --> 02:58:43.290
Erich Kron: And these folks when they're doing these military maneuvers and stuff. They have to be very conscious about their decision making. This happens to be a picture of the Blue Angels.
02:58:44.130 --> 02:58:50.040
Erich Kron: And if you notice in his visor, he's flying very, very closely to some other aircraft.
02:58:50.850 --> 02:58:57.480
Erich Kron: There, you better have some pretty good decision making going on here because you know as you're adjusting your flight patterns and things like that.
02:58:57.750 --> 02:59:08.730
Erich Kron: It is absolutely critical that you get this right. But this is the source of this now it has been used as a process now for other types of
02:59:09.750 --> 02:59:28.380
Erich Kron: Business-type things, and it has been applied across the board to other things. So what is the OODA loop? Well, the OODA loop is kind of like this: it is basically four steps, Observe, Orient, Decide and Act. Now what does this mean? Well, it means that
02:59:30.180 --> 02:59:40.380
Erich Kron: You observe where you're at. So you look around and you see what's going on, you orient. That means, where am I in this mess of things going on. OK.
02:59:40.770 --> 02:59:49.320
Erich Kron: So maybe I'm standing in a room and I see that I am in one spot and I really want to be one step to the right. Okay.
02:59:49.620 --> 03:00:01.110
Erich Kron: So I have oriented. I've observed the area, I've oriented myself: I'm here, I want to be there. Now I decide what to do. This is the decision to take a step to the right, and then I do the action.
03:00:01.560 --> 03:00:05.460
Erich Kron: I take that step to the right now my mind goes back through this loop.
03:00:06.030 --> 03:00:14.820
Erich Kron: I observe the area, I orient: am I where I want to be? Did that step do what I wanted it to do or not? Right. And then I decide to maybe stay here, or whatever,
03:00:15.060 --> 03:00:24.060
Erich Kron: And then you take the action, but that's kind of how the cycle constantly revolves in our brain when it comes to making this decision process now.
03:00:24.630 --> 03:00:37.350
Erich Kron: The ideal situation for a social engineer is to hijack the OODA loop by creating a knee-jerk reaction. Okay. The idea is we effectively bypass those first three steps, and I make a point of this:
03:00:38.910 --> 03:00:49.950
Erich Kron: When you bypass those first three steps... those first three steps are critical thinking steps. We Observe, Orient and Decide. Now, if I as a social engineer
03:00:50.850 --> 03:01:09.780
Erich Kron: Can change one of these, or impact one of these, it directly affects the action that you're going to take, and that's the key to this. So sometimes what they do is change the observation piece of it, right? So in a social engineering
03:01:11.130 --> 03:01:20.250
Erich Kron: Kind of sense here, right, if somebody comes up to you and they're wearing a hard hat and a, you know, safety vest and a clipboard.
03:01:21.090 --> 03:01:29.730
Erich Kron: And they say, I need to get into, you know, I need to go look at this, you're going to make some assumptions, you're going to assume that they're here in official business so on so forth.
03:01:30.210 --> 03:01:37.980
Erich Kron: That is your observation side you're going to see them and you make a decision based on what you see. So they can tweak that by
03:01:38.520 --> 03:01:47.580
Erich Kron: What they're wearing, what they're doing, right? A guy comes up to you in a black hoodie and says, hey, you know, I need to get in here to look at this, right?
03:01:48.060 --> 03:01:57.960
Erich Kron: Not to use the hacker black hoodie thing, but you're going to have a different reaction to them than to somebody who is wearing a day-glo vest and maybe even carrying a ladder. I saw a fantastic
03:01:58.620 --> 03:02:05.880
Erich Kron: YouTube clip where people went into these different businesses. It was two people; they're wearing those safety vests and they're carrying a ladder.
03:02:06.150 --> 03:02:12.750
Erich Kron: And they walked right into movie theaters. They walked into like just about anywhere because people just make that assumption.
03:02:13.050 --> 03:02:21.420
Erich Kron: By observing them, we have now been influenced, and our action is then to let them in, or say nothing, because we just assume
03:02:22.290 --> 03:02:33.840
Erich Kron: They should be there, right? So then orientation: if you can change how people think they fit in the mix of things going on. And this also applies to whether things seem to apply to that person.
03:02:35.010 --> 03:02:43.290
Erich Kron: In other words, does it matter to me that this is going on. No, it doesn't. Okay, fine. And then of course the decision piece. So all of those things.
03:02:43.920 --> 03:02:56.160
Erich Kron: Impact the action that happens afterwards. And that's what's important to understand is how our brain works there and just how easy it is to kind of tweak some of these things, honestly.
03:02:57.060 --> 03:03:04.770
Erich Kron: So moving on. They subvert these components. That's what they do. And if you've ever seen this show. I love this show.
03:03:05.670 --> 03:03:11.310
Erich Kron: I don't recommend cuddling up with the kiddos to watch it. It's kind of a raw show, but it's about hacking
03:03:12.060 --> 03:03:22.140
Erich Kron: And it's called Mr. Robot. Now in this show they use real live attack scenarios. Okay. And they use real tools. It's not made up stuff.
03:03:22.680 --> 03:03:29.070
Erich Kron: One of the key things they use is social engineering and in one place. They're looking to break into this building.
03:03:29.610 --> 03:03:36.450
Erich Kron: That's supposed to be very hard to get into, and they're looking at photographs. One of them says, I just don't see how we're going to get into that. And, you know,
03:03:36.840 --> 03:03:45.750
Erich Kron: I think it was Elliot there, he looks at it and says, well, I see eight vulnerabilities right here, or three vulnerabilities, something like that. And the guy's going, where, where, where? And he said: the people.
03:03:46.380 --> 03:03:51.930
Erich Kron: And that's exactly how these bad guys look at this. So I love this quote from Elliot Alderson:
03:03:52.590 --> 03:04:00.240
Erich Kron: I've never found it hard to hack most people. If you listen to them and watch them, their vulnerabilities are like a neon sign screwed into their heads.
03:04:00.660 --> 03:04:08.490
Erich Kron: And this is very true, as humans, again, we have these vulnerabilities. I see it all the time. Every year they have this thing called DEF CON.
03:04:09.330 --> 03:04:15.000
Erich Kron: And this is done in Vegas big hacking conference and there's something called the social engineering capture the flag.
03:04:15.510 --> 03:04:22.980
Erich Kron: What they do is they put people up in a soundproof booth with microphones inside and they sit in front of like 400 strangers.
03:04:23.400 --> 03:04:29.130
Erich Kron: But they start calling companies and they're tasked with gathering information from these people
03:04:29.850 --> 03:04:38.610
Erich Kron: In order to basically build a profile as if they were a hacker so they call these companies and they'll use different techniques where they'll be
03:04:39.600 --> 03:04:48.660
Erich Kron: Helpful or they may be a little bit aggressive or whatever. But they start asking them questions they find out what type of computer. They're running what operating system they're running
03:04:49.800 --> 03:04:59.010
Erich Kron: I've heard... I've seen them pull out what type of HID cards, the little ones that you scan badges with. I mean, this is all incredibly valuable information for hackers.
03:04:59.520 --> 03:05:13.860
Erich Kron: What their antivirus is. Well, if I know what that is, then I can work to bypass that if I'm going to send malware your way. I mean, I've seen people give away tons of information, and it's all done over phone calls. There's even a YouTube video out there, if you look for it.
03:05:15.540 --> 03:05:23.040
Erich Kron: It's a person doing social engineering on a cell carrier. I don't remember if it was Verizon or one of the cell companies.
03:05:23.400 --> 03:05:29.520
Erich Kron: But basically, they took over an account of a reporter in a matter of a couple of minutes, and she had a
03:05:30.000 --> 03:05:38.730
Erich Kron: YouTube video of a screaming baby playing on a laptop in the background. And if I'm not mistaken, the pretext, or the story, that she used was:
03:05:39.210 --> 03:05:49.470
Erich Kron: My husband is deployed, I'm trying to fix this stuff, I don't know what's going on. And she actually got the cellular carrier to make her a primary on the account,
03:05:49.860 --> 03:06:00.030
Erich Kron: Just over the phone, right. This is the kind of stuff that happens and it's because this person wanted to be helpful because they thought this person was in trouble or having, you know, hardships.
03:06:00.600 --> 03:06:12.600
Erich Kron: This happened over and over again. It was an amazing thing to watch how quickly this happens. That's what social engineering is; those are the vulnerabilities we're talking about here. Now, we see this played out in phishing all the time.
03:06:13.860 --> 03:06:22.620
Erich Kron: This is one of the ways that they do it. This was one from a couple of years ago now, during, you know, when we had a lot of the school shootings. And here's the thing: attackers
03:06:23.100 --> 03:06:28.770
Erich Kron: They don't care. They don't care about your feelings. They don't care about whether it's right or whether it's wrong.
03:06:29.070 --> 03:06:39.570
Erich Kron: They really don't. They'll do anything to bypass critical thinking. So as this was going on. Tensions were already very high. So we saw these going out. This is an actual email that went out.
03:06:40.560 --> 03:06:46.290
Erich Kron: And it went out to a college okay security alert reported on campus. This is what it is.
03:06:46.710 --> 03:06:52.770
Erich Kron: And what it did, you know, an emergency scare reported earlier, kind of go through the memo released and follow the outlined protocol.
03:06:53.130 --> 03:07:04.830
Erich Kron: In this case, what it did is it took them to a Microsoft login page. Now, this particular college used Microsoft as their email provider, so people didn't think it was unusual to log into
03:07:05.370 --> 03:07:13.800
Erich Kron: Their Microsoft account to be able to do it. Well, the bad guys stole the credentials, and then they could get into this and use it
03:07:14.550 --> 03:07:22.800
Erich Kron: For attacking other people from a legitimate account. The other thing people use this for, once they steal a set of credentials or even get one out of other breaches,
03:07:23.160 --> 03:07:32.430
Erich Kron: Is something called credential stuffing and that is where you take a username and a password that you know a person uses and you try it on different websites.
03:07:32.880 --> 03:07:43.740
Erich Kron: So let's say the you know the knitting forum that you're a member of gets hacked and they end up losing all of your credentials. If you use that same username and password on Amazon.
03:07:44.400 --> 03:07:52.200
Erich Kron: They're going to try to plug that into Amazon now they're in there and they can buy things on Amazon. Using your credit card or your email or even worse, banks.
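The credential stuffing idea just described, replaying one leaked username and password pair against other sites, can be sketched offline. This is a hypothetical simulation using plain dictionaries; the site names, usernames and passwords are all made up, and nothing touches a network:

```python
# Offline sketch of credential stuffing: a username/password pair
# leaked from one breach is replayed against other sites. All data
# here is hypothetical and the "sites" are just dictionaries.

leaked = {"knitter42": "purl2knit!"}  # credentials lost in one breach

# What each mock site actually has on record for that user.
site_accounts = {
    "shopping-site": {"knitter42": "purl2knit!"},   # password reused
    "bank-site":     {"knitter42": "x9$uniquePW"},  # unique password
}

def stuffing_hits(leaked, site_accounts):
    """Return the sites where a leaked pair also logs in."""
    hits = []
    for user, password in leaked.items():
        for site, accounts in site_accounts.items():
            if accounts.get(user) == password:
                hits.append(site)
    return hits

print(stuffing_hits(leaked, site_accounts))  # ['shopping-site']
```

The defensive point from the talk follows directly: a unique password per site makes the reuse check fail everywhere except the originally breached site.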
03:07:52.800 --> 03:08:01.650
Erich Kron: People reuse passwords across the board. And this is a very dangerous thing. There's another derivative of that called password spraying
03:08:02.580 --> 03:08:12.900
Erich Kron: And that is where they take a username that they get from one of these areas, and they use... if you've ever seen that top 10 most common passwords thing that goes out every year,
03:08:13.200 --> 03:08:20.820
Erich Kron: You know, 12345, they'll pair those together and try to do logins using that as well. So if you're using one of those standard ones,
03:08:21.150 --> 03:08:32.010
Erich Kron: That's a problem, because password spraying will let people into your account. But this is what they're doing to try to get this information. You can see they're dealing with fear here; they're dealing with concern.
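Password spraying, as described above, flips the pattern: instead of one known password tried on many sites, a short list of very common passwords is tried against many usernames. A hypothetical, offline sketch with made-up accounts:

```python
# Offline sketch of password spraying: a handful of very common
# passwords is tried against many usernames. All data is made up;
# the account "directory" is just a dictionary, no real logins.

common_passwords = ["123456", "password", "qwerty"]

# Mock directory of accounts and their actual passwords.
accounts = {
    "alice": "123456",         # weak: on the common list
    "bob":   "T7#longUnique",  # strong: not on the list
    "carol": "qwerty",         # weak: on the common list
}

def spray(accounts, common_passwords):
    """Return the accounts that fall to a common-password guess."""
    compromised = {}
    for user, actual in accounts.items():
        for guess in common_passwords:
            if guess == actual:
                compromised[user] = guess
                break
    return compromised

print(sorted(spray(accounts, common_passwords)))  # ['alice', 'carol']
```

Which is why, as the transcript says, using any password from the published common-password lists is a problem: those are exactly the guesses that get sprayed.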
03:08:32.400 --> 03:08:41.670
Erich Kron: And this is what they're doing to do these attacks. You've also seen these clickbait ads; there's science to it. You know, the
03:08:42.930 --> 03:08:51.990
Erich Kron: Top 10 things that blah, blah, blah, number four will blow your mind, right? This is actually scientific; it uses what's called pattern interruption,
03:08:52.470 --> 03:09:05.070
Erich Kron: Based on the information gap theory. So they give you some of the information, but not all of the information, and then you want to know the rest of that information. You know, it's the FOMO thing, or fear of missing out, if you will.
03:09:06.090 --> 03:09:12.000
Erich Kron: And it deprives us of the answer. Well now that makes us curious and we want to go click on things.
03:09:12.720 --> 03:09:24.210
Erich Kron: It's absolutely amazing how well these work. That's why you see so many ads that are just like this: because they work. But we also see the bad guys do this in malvertising, it's called. It's
03:09:25.290 --> 03:09:32.640
Erich Kron: Malicious advertising. So we'll see stuff like this right now with COVID-19 going on. You'll see stuff that's, you know,
03:09:33.450 --> 03:09:41.130
Erich Kron: Get, get more information from the government here or blah, blah, blah. I can't believe there's this trick to getting your check sooner or
03:09:41.790 --> 03:09:48.330
Erich Kron: You know, we live in Florida. We haven't, haven't we been in the news a little bit on an unemployment website right
03:09:48.840 --> 03:09:51.930
Erich Kron: Here's a hack to getting your unemployment filed in Florida. Right.
03:09:52.320 --> 03:10:00.120
Erich Kron: These things are all meant to get people to want to do this. Now, the bottom left one there, I have: five things you need to know about clickbait,
03:10:00.510 --> 03:10:07.290
Erich Kron: Number four will blow your mind. Folks, that is a clickbait ad for clickbait. I just love that; I had to include that one.
03:10:08.070 --> 03:10:22.740
Erich Kron: The other thing that these do often is leverage two of the major emotions that people have, and that is outrage and anger. They are the two top emotions that will get people to perform an action. So if I can scare you,
03:10:23.850 --> 03:10:37.800
Erich Kron: Or anger you, rather, or make you outraged, I can get you to perform an action. And you see this in the political world all the time, right? Doesn't matter what side of the aisle you're on. Can you believe so-and-so did this? Click this link to go
03:10:38.430 --> 03:10:45.150
Erich Kron: And you know sign the petition and we need to confirm you so we need to know everything about you and your mom and your social and
03:10:45.480 --> 03:10:57.600
Erich Kron: And all of that. Those are ploys used to get that sort of information and it's very, very effective. If I can outrage or anger somebody it's going to be very, very effective to get that kind of information.
03:10:59.010 --> 03:11:10.440
Erich Kron: So we see this a lot, business email compromise. These are attacks that are meant to not necessarily have any attachments. Okay, this is like wire transfer fraud.
03:11:11.400 --> 03:11:17.190
Erich Kron: Hey, so and so, I need you to transfer some money to this place or that place because blah, blah, blah, blah, blah, they're pretending to be
03:11:17.580 --> 03:11:26.940
Erich Kron: The executive director, or a CEO, or something along those lines. There's no payload, in other words no technical things attached, so it's hard for defenses to see these because there's,
03:11:27.270 --> 03:11:39.090
Erich Kron: You know, no malware attached and stuff like that. They tend to be very targeted. Now, these oftentimes use what's called OSINT, or open source intelligence. That's stuff like LinkedIn, Facebook,
03:11:39.630 --> 03:11:46.830
Erich Kron: All the stuff that's out there on the web about you that we don't have to pay for that's open source intelligence and it is mind blowing.
03:11:47.220 --> 03:11:58.740
Erich Kron: How much is out there so they can target these things and personalize them towards people you have a CEO, for example, and on your page you want to show what a great person, the CEO is so you're like, oh, this CEO.
03:11:59.460 --> 03:12:08.460
Erich Kron: Supports the Humane Society, for example. Right. Well, now I know if I'm going to target somebody I may put together an email that says
03:12:10.020 --> 03:12:17.430
Erich Kron: You know, it's supposed to come from the CEO, and it's going to go to, say, a financial controller, and it says, hey, during this COVID crisis, the Humane Society
03:12:17.760 --> 03:12:29.790
Erich Kron: Is really struggling. I need you to send them 10 grand right now. I'll take care of the paperwork afterwards. But otherwise, it's going to be really ugly and you may realize that, oh, you know what
03:12:30.180 --> 03:12:38.280
Erich Kron: The CEO loves this sort of thing. Yes. Okay. And that's going to make it very, very effective. So this is the kind of attack that really doesn't
03:12:38.640 --> 03:12:46.050
Erich Kron: Doesn't do anything technical but makes a lot of money. We're talking billions of dollars every year that are being lost to these kinds of attacks.
03:12:46.590 --> 03:12:54.600
Erich Kron: Supply chain and invoice fraud. This is ones where people either make up invoices and send them out or more than likely they'll try to get into an account.
03:12:55.200 --> 03:13:01.470
Erich Kron: And when they get into the account, the email account, what they'll do is they'll start sending messages
03:13:01.800 --> 03:13:10.290
Erich Kron: With either fake invoices or real invoices and say, oh, I gave you the wrong email or the wrong account number, send the payment here instead. We see this all the time.
03:13:10.650 --> 03:13:18.720
Erich Kron: We also see it in business or in real estate transactions, both on the personal and commercial side we see this happen.
03:13:19.500 --> 03:13:29.010
Erich Kron: Over and over again, right? So, it's time to send your down payment on your home, here's the account you wire that to. Oh wait, we have the wrong account number, send it to this one instead.
03:13:29.730 --> 03:13:39.570
Erich Kron: That sort of stuff is going on all the time right now, commercial and personal. There was someone in Florida who lost 70 grand on their down payment for their personal home like that.
03:13:40.980 --> 03:13:51.090
Erich Kron: This is an interesting example. This is a 48-year-old Lithuanian guy who tricked two American companies, Google and Facebook, out of 100 million dollars.
03:13:51.510 --> 03:14:06.240
Erich Kron: He spun up some offshore companies that sounded similar to major vendors that had worked with these two companies and just started sending them invoices and they paid them to the tune of 100 million dollars before it all came out that. Wait a minute. This isn't right.
03:14:07.470 --> 03:14:20.640
Erich Kron: Very simple to do. I mean, if you think about it, it's very low tech, but wow, what a payoff, right? A hundred million dollars, that's nothing to scoff at. So what we've been seeing lately is this, and I wanted to bring this up.
03:14:21.810 --> 03:14:33.150
Erich Kron: This has gone off the charts for people. This will give you an idea of what's been going on. People are working from home now. They're not in the same situation where they have corporate protections and things like that in place.
03:14:33.930 --> 03:14:41.340
Erich Kron: And it's gotten really, really ugly. So if you look, starting in the week of March 8th to the 14th, there were 16 brand new
03:14:41.850 --> 03:14:46.440
Erich Kron: Phishing templates that we could see that were strictly related to COVID-19.
03:14:47.400 --> 03:15:06.390
Erich Kron: So that means different kinds of attacks, right? Brand new attacks, 16 of them. The next week, the 15th through the 21st of March, 36 brand new. And the week after that, the 22nd through the 28th, 94 brand new templates being used to attack people.
03:15:08.790 --> 03:15:15.870
Erich Kron: It's gone off the charts. One of the guys we have whose job is to actually look through this, he put this chart together.
03:15:16.890 --> 03:15:27.810
Erich Kron: He said a lot of the traditional spam groups, which is the ones sending you ads for things you don't want, right, they've actually switched gears and moved over to doing phishing.
03:15:28.320 --> 03:15:39.210
Erich Kron: Because they already have that in place, the ability to send emails, and it is so lucrative that they've made the jump. Now on the plus side, I'm getting fewer phone calls about, you know,
03:15:40.020 --> 03:15:48.690
Erich Kron: My car warranty, right? But it's definitely happening where people are using this to target you. So how do you defend yourselves? Well,
03:15:49.260 --> 03:15:53.820
Erich Kron: The first thing you need to do is understand if you're being manipulated. I like to tell people
03:15:54.510 --> 03:16:01.680
Erich Kron: If an email, a text message or a phone call elicits an emotional response, be very, very careful. Right? So in this case, we're looking at
03:16:02.250 --> 03:16:08.520
Erich Kron: The lures they use: our greed, like the Nigerian prince scam, right, that has been trying to give away money for years.
03:16:09.210 --> 03:16:21.210
Erich Kron: Curiosity, like the clickbait stuff. Self-interest. Urgency, urgency, urgency. Now, I have yet to see an email, and I had somebody say they have, but I've never seen one,
03:16:21.630 --> 03:16:25.770
Erich Kron: That says, hey, whenever you might feel like getting around to it.
03:16:26.370 --> 03:16:34.050
Erich Kron: Go ahead and do this, right? I've never seen a phishing email that's like that, you know, if you get a chance, why don't you go ahead and do this. No, no, no.
03:16:34.500 --> 03:16:42.450
Erich Kron: It's always urgency. There's something going on, there's somebody upset, there's some reason that this has to be done quickly. Why?
03:16:42.870 --> 03:16:49.140
Erich Kron: Because they don't want you to have the time to apply those critical thinking steps that we talked about, from the OODA loop, right?
03:16:50.010 --> 03:17:02.700
Erich Kron: Think about the times you've been upset or angry, and what kind of decisions do you make in that frame of mind, right? Are they good decisions, usually? Are they well-thought-out decisions? No, not typically, right?
03:17:03.210 --> 03:17:09.960
Erich Kron: That's what they're leaning on here. Of course, fear, you know, we have those ones that are going around
03:17:11.040 --> 03:17:24.330
Erich Kron: That talk about the government's going to come after you, the IRS is going to come get you, you know, unless you pay the IRS in iTunes gift cards. Really? Let's think about that. In the next, you know, 12 hours we're going to come arrest you.
03:17:25.620 --> 03:17:27.660
Erich Kron: Those generate a lot of fear. Right.
03:17:28.710 --> 03:17:36.330
Erich Kron: And it gets people to overlook the fact that you're paying the IRS in iTunes gift cards, right? I'm only imagining every IRS agent has, like, every song in
03:17:36.780 --> 03:17:45.450
Erich Kron: iTunes already. I don't know. But then helpfulness is in this too. Again, when you see stuff like this, with COVID you're going to see fake
03:17:46.230 --> 03:17:54.810
Erich Kron: Charities, you're going to see things like that. People may even, if you're a not-for-profit, try to spoof you to get people to take money
03:17:55.230 --> 03:18:04.230
Erich Kron: And give it to another organization thinking they're helping you out, right? But that's another kind of attack that we see. If you have any of these, be super, super cautious with them.
03:18:04.830 --> 03:18:14.070
Erich Kron: And try to step back, take a deep breath and go, does this really make sense. Is there really a Nigerian prince wanting to send me money. Right.
03:18:15.450 --> 03:18:23.700
Erich Kron: Those are the key things there. Now we have this. It's our 22 red flags; you can actually get a PDF, it's free for download from us.
03:18:24.660 --> 03:18:32.580
Erich Kron: Red flags and essentially what it's designed to do is get people to look at an email and go, you know what, okay, these are some things I can look at
03:18:32.880 --> 03:18:40.860
Erich Kron: Is the reply-to address right? If I hover a link, does it actually go to where it says it goes? You know, and I like to tell people it's as easy as going,
03:18:41.640 --> 03:18:50.130
Erich Kron: You know 2am on Saturday I got an email from the HR department. And that's unusual. Maybe I should be real careful with this one.
03:18:50.730 --> 03:18:56.700
Erich Kron: That's kind of the thought process. You want to have with this, but this PDF is fantastic and gives you a way to kind of
03:18:57.000 --> 03:19:08.220
Erich Kron: quickly look at these things and go, okay, these are some things I should check on. I love this document for that. As a matter of fact, a lot of us Knowsters at KnowBe4 have this printed out in our cubes in the office as well.
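A couple of the red flags he mentions, like a mismatched reply-to address, are mechanical enough to sketch in code. This is an illustrative check using Python's standard-library email parser, not KnowBe4's actual tooling, and the addresses are made up:

```python
from email import message_from_string
from email.utils import parseaddr

def domain_of(address_header: str) -> str:
    """Extract the domain from a header like 'HR <hr@example.com>'."""
    return parseaddr(address_header)[1].rpartition("@")[2].lower()

def reply_to_red_flag(raw_email: str) -> bool:
    """Flag messages whose Reply-To domain differs from the From domain."""
    msg = message_from_string(raw_email)
    from_domain = domain_of(msg.get("From", ""))
    reply_domain = domain_of(msg.get("Reply-To", ""))
    return bool(reply_domain) and reply_domain != from_domain

raw = (
    "From: HR Department <hr@example.com>\n"
    "Reply-To: hr-payroll@evil.example.net\n"
    "Subject: Urgent: update your direct deposit\n"
    "\n"
    "Please act within 12 hours.\n"
)
print(reply_to_red_flag(raw))  # True: the Reply-To domain does not match
```

Hovering links gets the same treatment in practice: compare the displayed text against the actual href before clicking.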
03:19:08.970 --> 03:19:17.820
Erich Kron: So does training work? Yes, yes, it does. It's actually a great place to put some of your resources. All right.
03:19:19.320 --> 03:19:29.190
Erich Kron: We did this study. This was 4 million users; we tracked them for a year. Initial click rates on phishing emails: 37.9% on these 4 million users.
03:19:29.580 --> 03:19:46.560
Erich Kron: In about three months, and that means training and doing one or two of those simulated phishing attacks, it was down to 14%, a reduction by over half. And then at the end of the year, they were down to 4.7%. That is a ridiculous
03:19:47.580 --> 03:19:57.570
Erich Kron: Change in people, from almost 38% to less than 5%. And frankly, I see less than this a lot of times; I see it down in the 1 to 2%. Now, it's important to understand
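The improvement he quotes works out like this; the helper is just arithmetic on the percentages cited in the talk:

```python
def relative_reduction(before: float, after: float) -> float:
    """Percentage drop from a starting rate to a later rate."""
    return (before - after) / before * 100

# Click rates from the study: 37.9% baseline, 14% at ~3 months, 4.7% at 12 months
print(round(relative_reduction(37.9, 14.0)))  # 63 -> well over half in three months
print(round(relative_reduction(37.9, 4.7)))   # 88 -> nearly nine-tenths over the year
```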
03:19:57.960 --> 03:20:08.070
Erich Kron: We don't say we're the whole solution. There is no silver bullet to stop the phishing thing, all right? You need layers in place. You need email gateways
03:20:08.520 --> 03:20:20.010
Erich Kron: To help reduce the amount that's getting in. Google just released something that said they are blocking 18 million COVID-19-themed phishing emails every single day.
03:20:20.880 --> 03:20:28.770
Erich Kron: 18 million per day, just for COVID-19, right? So you need that kind of stuff on the front end.
03:20:29.220 --> 03:20:37.830
Erich Kron: And then you also need stuff on the other end as well. You need good antivirus on your computers. You need to make sure that if somebody does click on something.
03:20:38.670 --> 03:20:46.440
Erich Kron: That the antivirus can catch it. You also need to have really good backups in place because ransomware is the thing now and it's pretty ugly, right.
03:20:47.070 --> 03:20:55.260
Erich Kron: So there's these layers you have to have in place. Now, the user is kind of the place where it shifts from being proactive to reactive. In other words,
03:20:55.620 --> 03:21:01.800
Erich Kron: All of these things we're trying to stop from getting to the user. Some of them are going to get through. Nothing is 100%
03:21:02.310 --> 03:21:09.750
Erich Kron: And then the user if they click on that. Now you're relying on your antivirus your Endpoint Protection. Now you're relying on
03:21:10.110 --> 03:21:22.530
Erich Kron: These other tools to try to recover from the fact that the user clicked on it. And again, I'm not blaming the user, it's not their fault, but they are the target. And that's why we want to really try to help people.
03:21:23.790 --> 03:21:28.620
Erich Kron: Learn how to spot these things and protect themselves, not only does it help them at work, it helps them at home.
03:21:29.100 --> 03:21:37.500
Erich Kron: Once you learn to spot scams, you can spot them every day, kind of like that magic trick that I showed you, once you know how to spot these things.
03:21:37.890 --> 03:21:45.390
Erich Kron: It makes it a whole lot easier to stay safe at home. So this is why it's very important. I do see a question here about is there a link to the document.
03:21:45.990 --> 03:21:57.510
Erich Kron: I can get a link to that red flags PDF for y'all. You can also Google it, KnowBe4 22 red flags PDF, and it should pop right up at the top of Google.
03:21:58.380 --> 03:22:08.790
Erich Kron: So the way we work it is we have basically four steps. The goal is to have employees make smarter security decisions every day. We do a baseline test, we train people,
03:22:09.180 --> 03:22:17.070
Erich Kron: We do simulated phishing and then we see the results, and as they get better, we make them a little bit tougher, more like what's really happening out there.
03:22:17.430 --> 03:22:23.550
Erich Kron: And it's a continuous thing. You don't want to train people once a year and think they're going to know about it for the rest of the time.
03:22:25.770 --> 03:22:31.620
Erich Kron: You want to phish people at least once a month with the simulated ones, and again, it gives them a chance to practice what you teach them.
03:22:32.040 --> 03:22:39.120
Erich Kron: Where if they mess up. It doesn't really hurt the organization and they have a chance to learn from that. That's the whole idea. So if somebody clicks on a link
03:22:39.720 --> 03:22:50.010
Erich Kron: And it's not a legitimate link, it will take them to what we call a landing page and give them some point of failure training. You can also automatically enroll someone in like a five minute
03:22:51.060 --> 03:23:01.050
Erich Kron: Type of training from that as well. But you want to see the results. And again, continue to make it tougher. But it's really very, very easy. The beauty about the system right now is
03:23:02.580 --> 03:23:13.140
Erich Kron: Even if people are dispersed and working from home, (a) it's inexpensive, and (b) we can deploy it; because it's an online, on-demand sort of deal, we can deploy the training within a day or two.
03:23:13.590 --> 03:23:21.630
Erich Kron: It's very, very easy to do. It's one of the easiest things you can do in security. That's one of the reasons I'm so passionate about it because the impact is so high.
03:23:22.200 --> 03:23:29.730
Erich Kron: So having said that, that's about all I've got for slides. This is my contact information, feel free to reach out to me, follow me on Twitter there.
03:23:30.000 --> 03:23:41.460
Erich Kron: I do more on my personal Twitter account than I do the KnowBe4 Erich one, but feel free to reach out to me however you like. Always happy to answer questions. So, Erica, you got anything for us?
03:23:42.780 --> 03:23:47.910
Erica Woods: So I don't know if you broke down what ransomware was, I know you
03:23:47.910 --> 03:23:51.150
Erica Woods: Referenced it; that might be a helpful concept to share.
03:23:51.330 --> 03:24:12.150
Erich Kron: Cool, yeah. Ransomware is the scourge of everything right now. Frankly, ransomware is basically a type of malware, or malicious software, that gets into systems, and it uses encryption, the same stuff that keeps us safe, to basically take over and take hold of all of your data.
03:24:13.290 --> 03:24:24.570
Erich Kron: And then they charge you a ransom to give you the keys to decrypt the data. Okay, so basically they take all of your stuff and say, you know, you can't work until you fix this now.
03:24:24.960 --> 03:24:41.910
Erich Kron: This has been a huge, huge issue. We saw it a few years back with what was called WannaCry, which took out the National Health Service in Britain. We saw Baltimore and Atlanta get hit with it, but it also hits tiny little organizations as well.
03:24:43.650 --> 03:24:52.320
Erich Kron: And it goes after individuals. So imagine you download something on your machine, you double-click this file, whatever, and now it pops up a thing and says,
03:24:52.830 --> 03:24:57.750
Erich Kron: All of your data is ours. You can't, you know, you have to pay us 500 bucks to get it back or whatever.
03:24:58.380 --> 03:25:07.410
Erich Kron: You could lose all of the pictures of your kids, all the digital stuff, all of this stuff that you had to work for, you know, tax records, all the stuff that we keep digitally on computers now.
03:25:07.890 --> 03:25:15.210
Erich Kron: And it's really smart. A lot of times they will target backups, if they can find a backup. They will also encrypt that backup. So you can't restore it.
03:25:15.780 --> 03:25:24.780
Erich Kron: And we're even seeing strains now where, because people are getting better at preventing payment by restoring backups, like not having to pay,
03:25:25.560 --> 03:25:32.760
Erich Kron: What they're doing is they're actually exfiltrating data, pulling data out of your system, and then encrypting it.
03:25:33.600 --> 03:25:41.070
Erich Kron: And then they're saying if you don't pay us. We're going to dump this on the internet. So there was a bunch of sensitive information and stuff being publicly released
03:25:41.490 --> 03:25:51.510
Erich Kron: As well. So it's a really, really nasty thing. I do hour-long seminars just on ransomware, but it impacts everybody, and it's absolutely
03:25:53.130 --> 03:25:55.830
Erich Kron: It's absolutely horrible to have to deal with. It really is.
03:25:57.540 --> 03:26:11.850
Erica Woods: Thank you so much for that. So, Austin's questions, he's got a couple. Do you have any thoughts on bolt-on services to banking, i.e. QuickBooks, Mint and others, which require linking bank access?
03:26:14.730 --> 03:26:16.320
Erich Kron: That's a tough one because
03:26:17.340 --> 03:26:28.050
Erich Kron: You know, I think we always have to be cautious when it comes to things like that. Here's the deal. Every piece of software out there is going to have vulnerabilities. We've heard a lot about zoom, zoom this soon that right
03:26:29.670 --> 03:26:37.260
Erich Kron: Zoom was found to have vulnerabilities because all of a sudden everyone and their brother was using it, right? They jumped in user accounts, like, hugely,
03:26:37.590 --> 03:26:47.940
Erich Kron: In no time at all. Well, then the security people, including the good guys, are going to turn on that and they're going to start trying to security-test it, right? That's what we call white-hat hackers.
03:26:48.330 --> 03:26:56.640
Erich Kron: They're looking for vulnerabilities and reporting them. That's why all of a sudden you saw all of these vulnerabilities in Zoom. Well, you're going to have the same thing in just about any software.
03:26:57.930 --> 03:27:02.850
Erich Kron: So you have to understand there is always a risk if you're connecting things to your bank accounts, right.
03:27:04.050 --> 03:27:16.410
Erich Kron: It's a matter of looking at, is this an established organization that you're doing this with? Are you willing to take the risk? And what do you do if it does go all wrong? Right, those are kind of the key things you have to ask yourself.
03:27:17.610 --> 03:27:27.930
Erich Kron: I try not to connect too many things in there. But I mean, I'll be honest, I have a PayPal account; it links to my USAA checking account and can move money out, right?
03:27:28.860 --> 03:27:37.680
Erich Kron: But I'm really cautious I protect my PayPal account with just as much security as I do my bank account. So I have multifactor authentication.
03:27:38.220 --> 03:27:45.180
Erich Kron: There, which is, you know, when I go to login. I have to either pull up an app on my phone and get a number off that or it's going to text me a number.
03:27:45.480 --> 03:27:53.100
Erich Kron: It's something other than just my username and password in order to log in. So it's a matter of having those protections in place like that as well.
03:27:53.580 --> 03:28:03.900
Erich Kron: Because honestly, the least secure way is going to be the easiest way to get into your account, whether it's your actual bank account or a third party.
03:28:05.070 --> 03:28:21.390
Erica Woods: Great. So let's do Austin's second question, and then let's revisit Samir's questions. So the second question from Austin: do you have any advice on using and/or choosing a password manager? Are browser-based password managers a good idea?
03:28:21.900 --> 03:28:29.910
Erich Kron: Haha, I love password managers, let's just put that out there right now. I love password managers.
03:28:30.570 --> 03:28:42.780
Erich Kron: Password managers are essentially a way to deal with all of these passwords that we have to have, right? All of us probably have 30, 40, 50, some of us 100 or more passwords we have to deal with.
03:28:43.260 --> 03:28:53.490
Erich Kron: And so security professionals, we go around and say it has to be unique, it has to be long, it has to be complex, and, you know, don't write them down. I mean,
03:28:53.850 --> 03:29:10.470
Erich Kron: It's an impossible task, right? That's where password managers step in. Now, what I love about password managers, and I use LastPass, and I also use 1Password for some other things, okay, but LastPass is my big one that I use for most things:
03:29:11.580 --> 03:29:19.650
Erich Kron: It is an online one, and it syncs to my cell phone and to different computers, so I have access to these passwords just about anywhere.
03:29:20.430 --> 03:29:30.420
Erich Kron: Now, I go to sign up for a new account and it asks me for my username. I can put that in there. If it doesn't need my email address as the username, I may put something random in there.
03:29:30.840 --> 03:29:37.350
Erich Kron: And the password manager keeps that. Okay, then when it comes time to generate a password, how many of you have sat there going,
03:29:37.980 --> 03:29:42.900
Erich Kron: Oh, great. I got to come up with some other passwords. What do you do you reuse the one they use the most. Right.
03:29:43.350 --> 03:29:52.650
Erich Kron: Well, what you do here is you can go to your password manager, and let's say it says six to 40 characters. Well, nobody's ever going to come up with a 40-character password in their own mind.
03:29:52.980 --> 03:29:57.000
Erich Kron: But you pull that slider over and there you say generate password with all this kind of stuff.
03:29:57.510 --> 03:30:03.300
Erich Kron: You paste it in and you create the account with that it's automatically saved now in your password vault.
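The generate-password slider he describes can be sketched in a few lines. This is a generic illustration with Python's `secrets` module, not any particular manager's generator:

```python
import secrets
import string

def generate_password(length: int = 40) -> str:
    """Build a random password from letters, digits, and punctuation,
    using a cryptographically secure random source."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())  # a 40-character string nobody would invent by hand
```

The point is exactly what he says: because the vault remembers it, the password never has to be memorable.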
03:30:03.780 --> 03:30:11.820
Erich Kron: Now, when you go back to that page, sometimes it'll automatically fill it in for you, and you know, you hit a little thing and say yes, fill this in. Or you may open your
03:30:12.480 --> 03:30:18.000
Erich Kron: What's called the vault and copy that and paste it in. Fantastic, fantastic tools. Again,
03:30:18.360 --> 03:30:25.620
Erich Kron: Protect them with multifactor or whatever, so that you know you don't just use a username and password to get in, and that master password should be your very tough one.
03:30:26.220 --> 03:30:35.070
Erich Kron: But I love the browser-based ones. There are also offline ones like what's called KeePass, which I've used in the past. The problem is,
03:30:35.460 --> 03:30:49.200
Erich Kron: If something happens to that machine, and all of your passwords are on that machine and they all go away, you're in a world of trouble. Now, is there a risk to having passwords in the cloud? Yes.
03:30:49.860 --> 03:31:02.310
Erich Kron: However, not too long ago, I won't say it was a year, maybe two years ago, LastPass was quote-unquote hacked. The thing is, they use your password to encrypt things before it ever gets to them.
03:31:02.940 --> 03:31:10.350
Erich Kron: So even they can't decrypt what's in your vault up there. So them being hacked is not really a problem.
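The "encrypt before it ever gets to them" model rests on deriving the vault key from your master password on your own device, so the server only ever stores ciphertext. A simplified sketch; real products use vendor-specific salts, much higher iteration counts, and authenticated encryption on top of this:

```python
import hashlib

def derive_vault_key(master_password: str, account_email: str,
                     iterations: int = 100_000) -> bytes:
    """Derive a 256-bit key locally with PBKDF2-HMAC-SHA256.
    The salt (account email) and iteration count here are illustrative only."""
    return hashlib.pbkdf2_hmac("sha256", master_password.encode(),
                               account_email.encode(), iterations)

key = derive_vault_key("correct horse battery staple", "me@example.com")
print(len(key))  # 32 bytes; only data encrypted with this key leaves the device
```

Because the derivation happens client-side, a breach of the server yields only encrypted blobs, which is why the "hack" he mentions didn't expose vault contents.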
03:31:10.620 --> 03:31:18.330
Erich Kron: If the bad guys got all these passwords, and you have all these long passwords that are encrypted, well,
03:31:18.720 --> 03:31:26.040
Erich Kron: It's going to take them centuries to be able to try to get into these and by then I would really hope that you've changed your Facebook password. I mean,
03:31:26.490 --> 03:31:33.540
Erich Kron: So there's pros and cons to each. Some people don't like things in the cloud. Personally, looking at the security reviews that have gone on,
03:31:33.870 --> 03:31:43.350
Erich Kron: 1Password and LastPass are two fantastic ones. LastPass is even free; I'm not sure about 1Password. But they're great solutions for that sort of thing.
03:31:44.670 --> 03:31:51.720
Erica Woods: Awesome. I think you just answered the majority of the questions that Samir had his first two, but let me read them, just in case.
03:31:52.320 --> 03:32:04.920
Erica Woods: There is something else you want to throw in here for advice. He wanted to know about a manageable password strategy, how often should you change your passwords, and using a random password. Do you put
03:32:05.520 --> 03:32:06.330
Erica Woods: On all that
03:32:06.420 --> 03:32:09.420
Erich Kron: Well, there's arguments about how often to change passwords.
03:32:11.100 --> 03:32:15.300
Erich Kron: Changing passwords in a corporate environment can be an absolute nightmare.
03:32:16.500 --> 03:32:25.800
Erich Kron: Because, you know, when we're talking about servers and things, we'll change a password on a service and that takes out things we had no idea it was going to affect. So
03:32:26.400 --> 03:32:32.850
Erich Kron: You know, it can be very ugly sometimes. If you have a long and unique password,
03:32:33.840 --> 03:32:41.130
Erich Kron: In other words, you don't use it anywhere else. You don't use it on your knitting forum where, if they get hacked, they're going to reuse it in another place.
03:32:41.790 --> 03:32:49.770
Erich Kron: If you have a unique one. I honestly believe that changing your password is not as critical on a timeframe.
03:32:50.190 --> 03:32:59.220
Erich Kron: As it was in the past. The reason you change it is because somebody may get a hold of it over here. Or they may brute force it, which is where they take a computer and try
03:32:59.640 --> 03:33:16.290
Erich Kron: Try to break it down, essentially. But if you have a strong one, it's going to take them years. So what, maybe every year or so you change your password? I personally am okay with that, as long as it's a unique and long one that's complex.
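Why length dominates here is easy to put rough numbers on. The guess rate below is an assumption for illustration (real attack speeds vary enormously with hardware and hashing):

```python
def years_to_exhaust(length: int, charset_size: int = 94,
                     guesses_per_second: float = 1e10) -> float:
    """Worst-case years to try every password of a given length drawn
    from a charset of the given size, at an assumed guess rate."""
    total_guesses = charset_size ** length
    seconds = total_guesses / guesses_per_second
    return seconds / (60 * 60 * 24 * 365)

print(years_to_exhaust(8))   # a fraction of a year: 8 characters falls fast
print(years_to_exhaust(16))  # on the order of 1e14 years: effectively forever
```

That gap is the argument for a long, unique, generated password over a short one rotated frequently.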
03:33:17.700 --> 03:33:23.610
Erica Woods: Okay, great. And then I want to read the scenario. He shared because I think it's probably relevant for everyone.
03:33:24.390 --> 03:33:35.970
Erica Woods: You hit on advice already on this, but in case you have anything extra. So I've received emails lately from a different person mean each time, but all structured the same
03:33:36.240 --> 03:33:46.230
Erica Woods: Saying that they know my password, they typed it into the email, and they know even more about me, and they're going to expose more details if I don't respond to their email.
03:33:46.710 --> 03:33:54.510
Erica Woods: I've been ignoring the emails, but should I address it, should I report it. And more importantly, how do I prevent it from happening.
03:33:56.160 --> 03:34:07.980
Erich Kron: Yeah, so we've seen a rash of these that really started with what's called sextortion emails a couple of years ago. This is where we really saw these come into play. And it was,
03:34:08.700 --> 03:34:21.720
Erich Kron: You went to a website and you know a not good website and we turned on your camera while you were there and now if you don't pay us. We're going to release this video footage
03:34:22.890 --> 03:34:30.330
Erich Kron: To everybody. And by the way, just to tell you, we know your password is XYZ, right? And when this first happened, people were panicking.
03:34:31.320 --> 03:34:36.240
Erich Kron: And you know, I heard about it all the time: oh my gosh, that's my password. Well, here's the scary part about this.
03:34:36.840 --> 03:34:44.460
Erich Kron: In those traditional ones back then, the passwords they were using when this first came out were from a 10-year-old LinkedIn breach.
03:34:45.000 --> 03:34:51.720
Erich Kron: Which is another problem. If somebody comes up and goes, oh my gosh, they have my password, and it's a 10-year-old password you're still using, that's a problem, folks.
03:34:52.440 --> 03:35:00.900
Erich Kron: But they do this to try to scare people remember the fear piece, right. So they're trying to prove that they have this information, but most of the time.
03:35:01.410 --> 03:35:10.590
Erich Kron: That's going to come out of another data breach from another source. If you want to learn some things about yourself, take your email address, not your password.
03:35:10.980 --> 03:35:19.530
Erich Kron: Take your email address and go to Have I Been Pwned, P-W-N-E-D I think it is, haveibeenpwned.com.
03:35:19.950 --> 03:35:29.730
Erich Kron: It's a website by Troy Hunt and what it'll do is he takes all of these breaches and puts them in a database. And you can see if you've been a victim of
03:35:30.210 --> 03:35:44.070
Erich Kron: A breach at some point in time. Well, odds are you were. In a lot of cases, those breaches have passwords, and the scammers will put those in there just to try to make it relevant to you. So here's what I suggest: ignore them.
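For passwords specifically (as opposed to the email lookup he describes), Have I Been Pwned also offers a range API built on k-anonymity: you send only the first five hex characters of your password's SHA-1 hash and compare the returned suffixes locally, so the full password never leaves your machine. A sketch of the client-side split, with the HTTPS call itself omitted:

```python
import hashlib

def hibp_range_parts(password: str) -> tuple[str, str]:
    """Split a password's SHA-1 hash into the 5-character prefix that is
    sent to the range API and the suffix compared locally against the
    list of breached-hash suffixes the API returns."""
    digest = hashlib.sha1(password.encode()).hexdigest().upper()
    return digest[:5], digest[5:]

prefix, suffix = hibp_range_parts("password")
print(prefix)  # only these five characters would ever be transmitted
```

If the returned list for that prefix contains your suffix, the password has appeared in a breach and should be retired.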
03:35:45.120 --> 03:35:46.020
Erich Kron: They're not real.
03:35:47.310 --> 03:35:48.720
Erich Kron: If you want to
03:35:50.070 --> 03:35:54.480
Erich Kron: You can report them to, like, the IC3, which is the FBI's web page for internet crime complaints, IC3.
03:35:55.470 --> 03:36:04.740
Erich Kron: But nothing's really going to happen with that, to be honest with you. These things, the source emails, the websites they send you to, they pop up and down so quickly, it's just not going to happen.
03:36:05.490 --> 03:36:12.810
Erich Kron: As far as preventing it? Not going to happen, unfortunately. The internet and email itself is designed in such a way that
03:36:14.040 --> 03:36:15.480
Erich Kron: It's very, very
03:36:17.100 --> 03:36:25.890
Erich Kron: It's more about resilience than it is security. And so it's very hard. Unless you're in a corporate environment where maybe you want to apply
03:36:26.250 --> 03:36:34.950
Erich Kron: What's called SPF records, or Sender Policy Framework, and DMARC and DKIM, which are a couple of technologies that really try to
03:36:35.700 --> 03:36:43.980
Erich Kron: Limit the spoofing, there really isn't much you can do when somebody sends you an email that says they're somebody they aren't. Email is just not designed to be secure that way.
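To make the anti-spoofing technologies he names concrete, they are all published as DNS TXT records for the sending domain. These are the general shape of the records for a hypothetical example.org; the values (selector, key, policy) are illustrative, not a recommended configuration:

```text
; SPF: which servers may send mail claiming to be from example.org
example.org.                 IN TXT "v=spf1 mx include:_spf.example-mailer.net -all"

; DKIM: public key receivers use to verify signatures the sending server adds
sel1._domainkey.example.org. IN TXT "v=DKIM1; k=rsa; p=MIGfMA0...AB"

; DMARC: what receivers should do when SPF/DKIM alignment checks fail
_dmarc.example.org.          IN TXT "v=DMARC1; p=quarantine; rua=mailto:dmarc@example.org"
```

The catch, as he says, is that these only help receivers verify the sending domain; they do nothing about lookalike domains or a recipient who never checks.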
03:36:45.300 --> 03:36:48.870
Erica Woods: Great. We've got two in the queue, um,
03:36:50.070 --> 03:37:06.450
Erica Woods: Gosh, I'm so glad, Austin, you brought this up, because this literally just happened to one of my best friend's parents. But the question is: I always worry about my elderly family members accidentally downloading keylogger malware. Is that a reasonable thing to worry about?
03:37:07.710 --> 03:37:15.360
Erich Kron: Yeah, it still happens. And what will happen is, it's generally not a keylogger like the old keyloggers, which logged everything you type.
03:37:15.900 --> 03:37:24.090
Erich Kron: But we see, I think it's Dridex and some of the other ones, they're usually focused around banking. So when you go to a banking website,
03:37:24.630 --> 03:37:39.930
Erich Kron: It will start recording everything you type in, i.e. username, password, all that kind of stuff, right? And then it sends it back. It is a real threat; it is a real problem. So to avoid this, (a) make sure they have really good endpoint protection on their computers.
03:37:41.220 --> 03:37:45.570
Erich Kron: Believe it or not, Windows Defender is not bad; it's actually a halfway decent product.
03:37:46.560 --> 03:37:58.230
Erich Kron: And I hate myself for saying that. But it's free, it comes with Windows, and it's actually pretty darn good. There's also Sophos, which has some decent stuff out there that's free. You can set these things up on there.
03:37:59.430 --> 03:38:08.580
Erich Kron: We actually have a home course, a family version. We put out a bunch of stuff with COVID-19, and I'll see if I can get you a link on this too.
03:38:09.450 --> 03:38:28.680
Erich Kron: But it's training for people from, you know, kids to the elderly, to help them avoid that too. And I find it interesting, my mother-in-law and father-in-law cannot be any more different. Okay, my mother-in-law, she's the one that downloads every possible iteration of
03:38:29.700 --> 03:38:36.000
Erich Kron: You know may Zhang or whatever, whenever she sees it. My father in law. On the other hand, is the guy that
03:38:36.570 --> 03:38:43.530
Erich Kron: Wants to turn off the computer every single night because he's afraid if he leaves it connected to the internet. It's definitely going to get hacked by the Russians.
03:38:43.800 --> 03:38:48.690
Erich Kron: They live in the same household. It is a crazy relationship. I do a lot of tech support for them.
03:38:49.410 --> 03:38:53.910
Erich Kron: But I do find all kinds of malware on their machines from my mother in law downloading these things.
03:38:54.600 --> 03:39:10.140
Erich Kron: So it's important to teach them as much as you can, but it is a real threat. Your endpoint protection is probably going to be some of your better protection against that. Occasionally, you may want to run some anti-malware software on their machine for them.
03:39:11.730 --> 03:39:24.570
Erica Woods: Great, thank you. So, next question from Stan: for multifactor authentication, which methods are the most or least secure? Text, email,
03:39:25.650 --> 03:39:26.700
Erica Woods: App, etc.
03:39:27.210 --> 03:39:30.870
Erich Kron: Okay, so when you ask about most secure and least secure
03:39:32.340 --> 03:39:37.500
Erich Kron: That that's an interesting topic because most secure is not where most people should be
03:39:38.070 --> 03:39:43.890
Erich Kron: I'll be perfectly honest about that. The most secure I've seen and dealt with was when I was with the Army.
03:39:44.520 --> 03:39:52.680
Erich Kron: We had multifactor in some certain rooms that we had to go through that we had classified things in, I would have to use my fingerprint.
03:39:53.190 --> 03:39:59.310
Erich Kron: I would have to use what's called a CAC card. It's a smart card that had Windows credentials on it. I would have to put that in the reader.
03:39:59.670 --> 03:40:07.650
Erich Kron: I would have to unlock it with a PIN. Okay, that's three different things: I had to have my fingerprint, a PIN, and the card in my hand, right? That's three-factor.
03:40:09.450 --> 03:40:16.980
Erich Kron: That is not something that most people should have, but it is one of the most secure right so let's put that in, in a different if you don't mind me. Rephrasing this
03:40:17.490 --> 03:40:29.100
Erich Kron: What is the best for most people. Now, most people are familiar with the text message we get from the bank. The banks have taught us to do that. We already have our cell phones with us all the time.
03:40:29.520 --> 03:40:38.280
Erich Kron: Getting a text message in order to login, we've already been trained through the years to do that. It is the least secure it is the easiest to get people to adopt.
03:40:39.120 --> 03:40:51.570
Erich Kron: What I like to see is kind of in the middle between a text message and having to physically have a token, like a USB key or something like that, although I protect my last past with the UB K which is a hardware key.
03:40:52.650 --> 03:41:03.450
Erich Kron: In the middle is the apps that we have on smartphones, like Google Authenticator or one of those sorts of deals. What it does is it generates a code that resets every 30 seconds or so.
03:41:04.110 --> 03:41:16.290
Erich Kron: If you've ever used those it's terrifying when it starts flashing red and all that kind of good stuff, but it is one step up from having that text message because those can be intercepted there's ways to get past that.
03:41:17.550 --> 03:41:23.370
Erich Kron: But what it does is it it prompts you to enter that number and you have 30 seconds or resets the number on your phone.
03:41:23.910 --> 03:41:37.320
Erich Kron: Since we mostly have smartphones, these days. Anyways, we already have the devices on us that we will be getting text messages for. It's not a huge step to get to that point, but is far more effective than just text messages.
03:41:38.970 --> 03:41:56.040
Erica Woods: Awesome. I'm going to ask one last question, and then if there are any additional questions, please send me a message offline and we can work on trying to get those answered for you. But last one: any thoughts on endpoint protection for mobile phones?
03:41:59.760 --> 03:42:09.570
Erich Kron: Tim for me, mobile phones are not, they're not the biggest issue and I know there's some people out there that do it. What does it look out
03:42:10.920 --> 03:42:18.180
Erich Kron: Is one of them. I know there's some out there, it, it really depends on what you use your mobile for. So here's the deal. I have an iPhone.
03:42:18.690 --> 03:42:30.090
Erich Kron: I can basically only get things from the Apple store right and Apple is really good about killing things that are malicious very, very quickly, they do a pretty good job with that.
03:42:31.170 --> 03:42:37.110
Erich Kron: Android, what we see is most of the problems caused by Android are people going to third party.
03:42:38.340 --> 03:42:50.850
Erich Kron: app stores or what's called side loading AP case putting their own software on there, they kind of have to go out of their way to do that. I personally don't run Endpoint Protection on my phone.
03:42:52.260 --> 03:43:03.240
Erich Kron: I have in the past with lookout because look out also offered some tracking stuff. My wife wants left her phone on a tailgate of a pickup truck while they were fishing.
03:43:04.200 --> 03:43:14.790
Erich Kron: And while this was going on, somebody walked by and snatched it now. They were like putting stuff up. You know, there's traffic around some of these there's somebody snatched it and I was able to actually trace it to
03:43:15.240 --> 03:43:20.940
Erich Kron: Where the person had taken it to their house through through that and it was fantastic for that.
03:43:21.450 --> 03:43:31.680
Erich Kron: But I never saw anything in ANTIVIRUS THAT WOULD HAVE concerned me from that standpoint. The other option is you need to have something that can wipe device.
03:43:32.280 --> 03:43:44.580
Erich Kron: Remotely as well. So if you do lose something you're able to basically brick it and remove all your personal information that's where I see the protection being valuable on phones, more so than say antivirus.
03:43:45.810 --> 03:43:54.810
Erica Woods: Great, Erich. I can surely speak on behalf of everyone when I say thank you so much for all of this wonderful education
03:43:55.170 --> 03:44:07.710
Erica Woods: that you've provided to the nonprofit community and the volunteers that help out with nonprofits. I included some of the links to what Erich recommended, including the 922
03:44:08.250 --> 03:44:20.790
Erica Woods: tips for emails, as well as the password manager. Stan, thank you for providing the link to Have I Been Pwned. So make sure you save these links and scope those out.
03:44:21.510 --> 03:44:33.180
Erica Woods: The last two things I want to say: we did record today's session, so I would highly, highly, highly encourage you to remember that we have a recording of today's session.
03:44:33.540 --> 03:44:41.670
Erica Woods: And for anyone else who volunteers or works at your nonprofit, share that recording and share the links that we provided
03:44:42.240 --> 03:44:55.920
Erica Woods: with those folks. Maybe encourage it to be part of your onboarding process, because this is a topic, and Erich, I'm sure you're going to say amen, that is not talked about enough, and people just do not understand the basics.
03:44:56.370 --> 03:45:01.050
Erica Woods: And it is heartbreaking. We even had a nonprofit that came to one of our meetings,
03:45:01.470 --> 03:45:15.990
Erica Woods: and they had malware on their computer and had no idea. And luckily we have a number of folks in the security world that are active with our Tech for Good groups, and they were able to get that removed and make this individual aware of what they hadn't known was there.
03:45:16.530 --> 03:45:35.130
Erica Woods: So again, really educate other folks in your circle and spread the wealth. Also keep in mind, we've got six other recordings; I included the link. If you have an interest in search engine optimization, in WordPress, or in Give, which is a donation plugin for WordPress websites,
03:45:36.210 --> 03:45:40.950
Erica Woods: we've got topics that you could go back and access as 45-to-60-minute recordings.
03:45:41.280 --> 03:45:54.930
Erica Woods: And then the last thing for me: we've got two other events on our calendar right now that are available for everyone, because they're in the form of webinars like today. Next week, we're actually doing a crash course on how to use Adobe Spark
03:45:55.680 --> 03:46:06.510
Erica Woods: to make visual campaigns and images. And Adobe Spark actually just announced this week that their product is going to be free for two months, which is amazing timing
03:46:07.230 --> 03:46:16.080
Erica Woods: for that webinar, unintentionally. And then in May, we're doing a session on how to make sure your nonprofit website is accessible.
03:46:16.410 --> 03:46:28.230
Erica Woods: It is a requirement now that your website is accessible for everybody, and that's another area that nonprofits just simply are not educated about. Erich, any last words from you?
03:46:29.400 --> 03:46:30.480
Erich Kron: Ah, no, nothing.
03:46:30.480 --> 03:46:41.340
Erich Kron: Other than, you know, just be careful out there. We're seeing so many attacks happening right now with the chaos of moving people to working from home and all that.
03:46:42.000 --> 03:46:56.820
Erich Kron: Just be careful. Try to warn your people to be very, very careful with their emails, and if they get that, like I said, spidey-sense sort of tingle, please make sure they act on it. Take a deep breath and look at it, because, man, it's just getting crazy right now.
03:46:57.930 --> 03:47:00.150
Erica Woods: Thank you so much for that.
03:47:00.660 --> 03:47:12.000
Erica Woods: Thanks to everyone for tuning in. We hope to catch you in one of our next couple of webinars over the next month or so. And as always, keep us posted, including Eli, who runs the NetSquared Tech for Good group locally.
03:47:12.270 --> 03:47:18.330
Erica Woods: Again, my name is Erica. I'm an organizer for the Tampa group. I'm also still involved with the Baltimore group.
03:47:19.050 --> 03:47:36.780
Erica Woods: Keep me posted, especially if you're in either of those two cities, on what else we can be doing to help support your nonprofit right now, or just any questions from a technology standpoint, and we'll make sure we get the right people looped in. Thanks, Erich, nice seeing you. Bye, everybody.