WEBVTT

00:00:00.386 --> 00:00:11.187
- We're happy to welcome all our guests. We have seven guests today. We would love it if you would stand

00:00:11.187 --> 00:00:21.673
- so we could greet you with some applause. Please be patient with my pronunciations. So please welcome

00:00:21.673 --> 00:00:29.118
- a guest from the OCE Lab of Civic Engagement, Shilpa Nandiala. Shilpa?

00:00:30.146 --> 00:00:42.789
- Oh, are you observing? All right, a follow-up to last week's speaker already. A guest of Mike Baker, David

00:00:42.789 --> 00:00:55.063
- McFadden. Kyla Cox-Deckard is welcoming three guests with her from the Center for Rural Engagement,

00:00:55.063 --> 00:00:57.150
- and Elise Jenke.

00:01:00.802 --> 00:01:20.686
- Gina Vertrees and Milan Gaston. Another guest, Becky Wan, is a guest of Mark Peterson. Becky. And a

00:01:20.686 --> 00:01:27.646
- guest of Michael Wade, Megan Wade.

00:01:32.002 --> 00:01:41.373
- Welcome. If you have any questions about Rotary, feel free to ask the folks at your table. And do we

00:01:41.373 --> 00:01:51.023
- have any guests online? Hey, Hank. Oops. OK. Hello. Yes, Megan Wade is joining us online, so as a guest

00:01:51.023 --> 00:01:55.198
- online. And other than that, just Rotarians.

00:01:59.426 --> 00:02:06.449
- As I was saying, any of you who'd like to learn more about Rotary, just turn to someone at your table and ask

00:02:06.449 --> 00:02:14.034
- away. Birthdays, we have several birthdays to celebrate. On the 11th, past club president and past district

00:02:14.034 --> 00:02:21.619
- governor Lance Eberly. On the 12th, Forrest Gilmore. On the 14th, past club president and district governor

00:02:21.619 --> 00:02:27.870
- Judy Witt. And on the 15th, Erica Kovacs. One anniversary to observe, Judge Jeff Bradley

00:02:28.130 --> 00:02:35.044
- Five years with our club, 12 years total as a Rotarian. We have a number of announcements. Please join

00:02:35.044 --> 00:02:42.092
- the members of the Rotary Book Club following the regular celebration of service at 1 p.m. next Tuesday,

00:02:42.092 --> 00:02:48.805
- right after the meeting. They will have a brief meetup to field questions that Rotarians might have

00:02:48.805 --> 00:02:54.846
- about the group, and they'll also decide what the next book will be and the meeting date.

00:02:55.650 --> 00:03:03.281
- The Rotary District Conference registration is now open, scheduled for May 8th and 9th at the Galt House

00:03:03.281 --> 00:03:10.549
- in Louisville. And you can learn more and register at rotaryallstars.com and there should be a link

00:03:10.549 --> 00:03:18.325
- in the Roundabout. Save the date for the WonderLab Summer Blast Off on May 21st, an afternoon and evening

00:03:18.325 --> 00:03:21.886
- street party celebrating the last day of school.

00:03:22.530 --> 00:03:29.535
- Bloomington Rotary Foundation is the primary event sponsor, and we'll need some club members to volunteer

00:03:29.535 --> 00:03:36.673
- that day. It should be a fun time, and if you want to learn more about it, the link will be in the Roundabout.

00:03:36.673 --> 00:03:43.610
- Don't forget that during meeting times, club members can also park for free at the Henderson or Atwater

00:03:43.610 --> 00:03:49.214
- garages. I parked at Atwater today. Campus sidewalks are clear. It was a nice walk.

00:03:49.410 --> 00:03:57.003
- If you want to avoid some of the parking problems at the Union parking lot, or you don't want to navigate

00:03:57.003 --> 00:04:04.597
- all the students in front of the Union, both Henderson and Atwater are good choices. So I think I crossed

00:04:04.597 --> 00:04:11.975
- the line last week because I hugged Brad Fulton, our guest speaker, at the end of his talk. So the net

00:04:11.975 --> 00:04:18.494
- result is it has been determined that we need to be observed. Maybe I need to be observed.

00:04:19.970 --> 00:04:27.659
- So anyway, Shilpa is there in the back monitoring my behavior. But don't be concerned about this. Seriously,

00:04:27.659 --> 00:04:34.912
- as we heard last week, they have observed countless organizations, including the Bloomington Rotary

00:04:34.912 --> 00:04:42.383
- Foundation. And at a board level of the club, we talked about it a couple of months ago and said, yes,

00:04:42.383 --> 00:04:49.854
- we're willing to do this. So anyway, Shilpa or one of our colleagues will be here at Tuesday meetings.

00:04:49.954 --> 00:04:57.087
- They'll probably sit in on some committee meetings. They'll probably be at board meetings. So now

00:04:57.087 --> 00:05:04.500
- that we've welcomed her, forget that she's here, and business as usual. Tyler, if I can get you to stand.

00:05:04.500 --> 00:05:11.983
- So Tyler Martin Nichols is our Zoom and audio producer. And Tyler is still looking for the right full-time

00:05:11.983 --> 00:05:19.326
- employment. Main areas of interest are arts administration, student affairs, program project management,

00:05:19.874 --> 00:05:28.908
- faculty support, and university governance. Tyler has master's degrees from both O'Neill and Jacobs. If

00:05:28.908 --> 00:05:38.755
- you have an interest in learning more, see Tyler or see me. I'll vouch for Tyler. He's personable, reliable,

00:05:38.755 --> 00:05:44.446
- in a very quiet way very competent. He'd do a good job for anyone.

00:05:45.250 --> 00:05:53.277
- Last thing, it's been brought to my attention that several of us have received spam emails from people

00:05:53.277 --> 00:06:01.304
- we know in Rotary, a "special party invitation." So if you get that, just be very cautious. Let's see. Has

00:06:01.304 --> 00:06:09.332
- Jeff Richardson walked in yet? Yes. Jeff, are you ready for your reflection? All right. Jeff will give

00:06:09.332 --> 00:06:11.358
- our reflection for today.

00:06:26.402 --> 00:06:34.353
- Good afternoon. Reflecting on Lee Hamilton's life reminded me of the Jimmy Stewart character in Mr.

00:06:34.353 --> 00:06:42.463
- Smith Goes to Washington, a man of intellect, integrity, eagerness, kindness, tenacity, and humility.

00:06:42.463 --> 00:06:50.415
- And he never lost that competitive edge he had from being a top basketball player in Evansville and

00:06:50.415 --> 00:06:55.742
- at DePauw. How fortunate we were to have Lee in Congress from 1965

00:06:56.002 --> 00:07:03.764
- until 1999, then at the Wilson Center, a nonpartisan think tank in DC, and the Center on Congress at

00:07:03.764 --> 00:07:11.754
- IU, coupled with his teaching at the Hamilton Lugar School of Global and International Studies. He never

00:07:11.754 --> 00:07:19.440
- stopped serving the public's interest. In fact, his son reported he took his father to his IU office

00:07:19.440 --> 00:07:24.158
- the day before he died. That reminded me of beloved

00:07:24.674 --> 00:07:33.236
- Charlotte, who, from her bed, was still lobbying the day before she died. Before I share some personal

00:07:33.236 --> 00:07:41.243
- reflections, let's visit how just a few national and local media characterized Lee's passing. A

00:07:41.243 --> 00:07:49.805
- life of service and global leadership, the Midwestern congressman from Washington who should be remembered,

00:07:49.805 --> 00:07:53.214
- a compromiser who operates above the fray,

00:07:53.890 --> 00:08:00.733
- towering figure in national politics with ties to IU, Lee Hamilton, a legacy of honor. Beyond these

00:08:00.733 --> 00:08:07.576
- and many other headlines, I also connected with some old friends and reminisced about Lee. Speaking

00:08:07.576 --> 00:08:14.761
- with Wayne and Emily Vance is always entertaining and informative. Wayne worked for Lee both in Congress

00:08:14.761 --> 00:08:22.494
- and the Center on Congress for over 40 years. Wayne stated that Lee's passion for public service never wavered.

00:08:22.658 --> 00:08:29.731
- As much as Lee enjoyed his work, he loved coming back to southern Indiana and meeting with his constituents.

00:08:29.731 --> 00:08:36.414
- He would return to Indiana 41 weekends every year. Wayne explained that Lee felt it kept him grounded,

00:08:36.414 --> 00:08:43.097
- and his outreach was not limited to folks back home. For example, he went to see the troops in Vietnam

00:08:43.097 --> 00:08:48.158
- in the mid-1960s and to Iraq during the war there to hear directly from them.

00:08:49.186 --> 00:08:56.491
- And of course, he traveled the world, meeting world leaders, always listening and learning. Wayne added

00:08:56.491 --> 00:09:03.795
- that Lee believed that his greatest legislative achievement was the passage of Medicare. As a 75-year-old

00:09:03.795 --> 00:09:11.029
- recipient, thank you, Lee. Lee also counted his tenure on the 9/11 Commission as a ringing success and

00:09:11.029 --> 00:09:18.334
- a top life achievement post-Congress. One pundit characterized Lee as a quiet man who made himself heard.

00:09:19.074 --> 00:09:26.704
- So true. I had several specific memories about Lee during my life, but due to time, we'll share just

00:09:26.704 --> 00:09:34.409
- a few. When campaigning as an IU student volunteer for various Democrats in 1970, that's when I first

00:09:34.409 --> 00:09:42.266
- heard Lee speak, very impressed. After Frank McCloskey got elected in 1971, Lee would drop by City Hall

00:09:42.266 --> 00:09:48.158
- to see the mayor, but also make a point of walking the halls to visit others.

00:09:48.418 --> 00:09:55.014
- In the 1970s, when I was on the city council, he would stop by, always asking how things were going, how are

00:09:55.014 --> 00:10:01.675
- we doing in Washington, and what should we be doing? Of course, he was a regular at all the Democratic

00:10:01.675 --> 00:10:08.141
- events in southern Indiana, and along with Birch Bayh, who incidentally, helped get elected in 1962

00:10:08.141 --> 00:10:14.867
- as a chair of the Bartholomew Democrats, he was a crowd favorite. Bayh once... This is Birch Bayh, once

00:10:14.867 --> 00:10:15.966
- famously said,

00:10:16.066 --> 00:10:21.846
- If I could ask God one favor, it would be to have a whole bunch more Lee Hamiltons, the

00:10:21.846 --> 00:10:28.388
- perfect congressman. When Frank McCloskey was elected to Congress in 1982, I was honored, no, actually

00:10:28.388 --> 00:10:34.994
- thrilled to be part of Frank's team to get his new office off the ground. During my three months there,

00:10:34.994 --> 00:10:41.155
- the most regular visitor was Lee, sometimes a few times a day. Always upbeat, joyful, connected,

00:10:41.155 --> 00:10:46.046
- and eager. He would ask staff how they were doing and if we needed anything.

00:10:46.306 --> 00:10:53.757
- He knew no strangers. Fast forward to the early 1990s in Indianapolis. I was working for then-Governor

00:10:53.757 --> 00:11:01.064
- Evan Bayh at the Statehouse. One afternoon, there was a lunch panel with Lee Hamilton and Dick Lugar

00:11:01.064 --> 00:11:08.298
- about world affairs. It was breathtaking. Each spoke about meetings they had had with world leaders

00:11:08.298 --> 00:11:14.302
- and named them and the topics, and did so with ease and not a hint of self-promotion.

00:11:14.978 --> 00:11:21.908
- So Lugar would say when I was meeting, and some of it was reflecting back, some of it was current, when

00:11:21.908 --> 00:11:29.104
- I was meeting with Gorbachev last week, I don't know, whatever, and then, not tit for tat, but Hamilton

00:11:29.104 --> 00:11:35.768
- would say, yes, and I remember that when I was talking with Prime Minister Thatcher. And I just sat

00:11:35.768 --> 00:11:39.166
- there and I said, these two people, and they're so

00:11:39.298 --> 00:11:47.106
- not pretentious, not self-promoters, and talking about world affairs, and here they are, both from Indiana.

00:11:47.106 --> 00:11:54.768
- I mean, I was actually moved. I still am, that these two giants were from Indiana and were so compatible.

00:11:54.768 --> 00:12:02.504
- They sometimes would finish each other's sentences, and they were all civil, gracious, and kind, real-life

00:12:02.504 --> 00:12:06.046
- role models and real giants on the global stage.

00:12:07.170 --> 00:12:13.959
- It wasn't until very recently that I learned that Lee and Dick had been friends since 1967 when Dick

00:12:13.959 --> 00:12:21.150
- was running for mayor. As one pundit would wisely say, trust, respect, and integrity make change possible.

00:12:21.150 --> 00:12:28.141
- This honesty and wisdom was also the basis for Lee and Dick earlier receiving the Medal of Freedom from

00:12:28.141 --> 00:12:35.198
- President Obama. Lee's final column, just published a week ago, entitled Congress Needs More Friendships

00:12:35.618 --> 00:12:42.631
- Why, Lee asked? To reestablish Congress's ability to assert itself as a robust and effective branch

00:12:42.631 --> 00:12:49.784
- of government. He continued, you need to develop relationships built on trust to do more than just put your

00:12:49.784 --> 00:12:57.077
- name on a bill. He concluded, members will have to join together in friendships and transcend the usual

00:12:57.077 --> 00:13:03.038
- workday. It's all about friendships, about service above self. And I thank you, Lee.

00:13:03.362 --> 00:13:17.823
- for a lifetime of public service, you will be deeply missed. Thank you, Jeff. Well done. So celebration

00:13:17.823 --> 00:13:33.118
- of service. You may remember that just a few weeks ago, we finished our district grant project for this year.

00:13:33.346 --> 00:13:41.251
- Meals on Wheels. We had two phases. And then the cycle begins again. For next year's

00:13:41.251 --> 00:13:49.080
- district grant, in order to qualify, each club has to have two Rotarians participate in a one-hour phone

00:13:49.080 --> 00:13:56.985
- call in the month of January. And Michelle Cohen and Sarah Laughlin did that for our club. I think Sarah

00:13:56.985 --> 00:14:02.782
- has been on as many calls as anyone. She can probably recite it from memory.

00:14:03.010 --> 00:14:14.543
- but it's one of the requirements, and by doing it, you become eligible for a $6,000 grant.

00:14:14.543 --> 00:14:25.958
- So thank you, Michelle and Sarah. And now I'd like to ask Jeremy Graham to come up and join me,

00:14:25.958 --> 00:14:30.238
- if you would. Stand on either side.

00:14:33.026 --> 00:14:39.797
- So a native of Gary, Indiana, Jeremy Graham has been a Bloomington resident for almost 14 years. Jeremy

00:14:39.797 --> 00:14:46.308
- and his wife, Anisha, were high school sweethearts and have been together as a couple for 16 years.

00:14:46.308 --> 00:14:53.080
- They have a six-year-old son, and they're expecting a newborn son any day now, which is why I suggested

00:14:53.080 --> 00:14:59.786
- to Jeremy that you can go ahead and leave your phone on. And if all of a sudden I see you dart out the

00:14:59.786 --> 00:15:00.958
- door, all's good.

00:15:01.282 --> 00:15:07.978
- Jeremy is a full-time realtor here in Bloomington. He's the team lead for the Century 21 Sheets Graham

00:15:07.978 --> 00:15:14.479
- team. Anisha works with Jeremy. The two of them love to collaboratively guide, navigate, and assist

00:15:14.479 --> 00:15:20.980
- buyers, sellers, and investors through real estate transactions. In the community, Jeremy serves as

00:15:20.980 --> 00:15:27.677
- the membership vice chair of the NAACP Monroe County branch, serves on the welcoming committee for the

00:15:27.677 --> 00:15:29.822
- Bloomington Chamber of Commerce,

00:15:30.434 --> 00:15:38.885
- And he's also a shuttle bus driver for City Church, where he transports IU students to and from church

00:15:38.885 --> 00:15:47.336
- services. For fun, Jeremy enjoys spending quality time with his family, golfing with friends, building

00:15:47.336 --> 00:15:55.787
- relationships with people. Jeremy joins our Rotary Club as an NAACP organizational member, joining Jim

00:15:55.787 --> 00:16:00.382
- Sims, Jimmy Torrey, and Patrick Smith. Welcome, Jeremy.

00:16:00.738 --> 00:16:17.948
- Okay, we have time for an abbreviated membership section today. So today's quiz, we have lots of Rotarians

00:16:17.948 --> 00:16:30.494
- that exercise in different ways. Many of them work out at the Southeast YMCA.

00:16:31.458 --> 00:16:40.060
- But here's a list of four Rotarians. And one of these Rotarians does not exercise at the Bloomington

00:16:40.060 --> 00:16:48.662
- Southeast YMCA. And your choices are Rex Hillary, David Wright, Tracy Yovanovich, Steve Engel. So if

00:16:48.662 --> 00:16:57.349
- you think Rex Hillary is a Rotarian who does not work out at the Bloomington Southeast Y, put up your

00:16:57.349 --> 00:17:01.182
- hand and do the same online. Boy, no takers.

00:17:02.082 --> 00:17:10.633
- All right, how about David Wright? If you think David Wright is the one. All right, we have a few. Tracy

00:17:10.633 --> 00:17:19.022
- Yovanovich, and I will qualify, Tracy's on a cruise now, so we're talking about when she's home. If you

00:17:19.022 --> 00:17:27.736
- think Tracy is the one who does not work out at the Southeast Y, put up your hand. We have a few. Finally,

00:17:27.736 --> 00:17:29.854
- if you think Steve Engel,

00:17:29.954 --> 00:17:38.956
- is the one who does not work out. And I see one. All right. Well, if you guessed Rex Hillary,

00:17:38.956 --> 00:17:48.534
- you are wrong. Rex participates regularly in YMCA classes. Here's a picture of Rex from a few years

00:17:48.534 --> 00:17:58.398
- ago. Rex is front left. And in the background, you can also see Mike Wade and Gus Juskalis. All right.

00:17:59.874 --> 00:18:07.469
- If you guessed David Wright, you would be right. David enjoys hiking and biking, but he lives

00:18:07.469 --> 00:18:16.452
- near Martinsville. So it'd be a long way for him to work out at the Southeast Y. David has done some reflections.

00:18:16.452 --> 00:18:24.781
- We also had a club session with David as a speaker. He is our club puppeteer. Tracy Yovanovich. If

00:18:24.781 --> 00:18:27.966
- you guessed Tracy, you would be wrong.

00:18:28.226 --> 00:18:35.727
- Tracy has been a runner for many years. Now, she is very much a walker. If any of you have ever tried

00:18:35.727 --> 00:18:43.449
- to keep up with Tracy as she goes around the wide track, good luck with that. Here's a picture of the two

00:18:43.449 --> 00:18:51.171
- of us here on July 1st. Tracy is the more attractive individual on the left. And finally, if you guessed

00:18:51.171 --> 00:18:53.598
- Steve Engel, you would be wrong.

00:18:53.890 --> 00:19:02.426
- Steve is a regular weightlifter at the Y. So he's there exercising on a regular basis. So we did pretty

00:19:02.426 --> 00:19:10.880
- well, I would say, for those who voted. The correct answer was David Wright. Rotary International has seven areas

00:19:10.880 --> 00:19:19.498
- of focus. And February is Peacebuilding and Conflict Prevention Month. We're going to skip Happy Dollars

00:19:19.498 --> 00:19:22.206
- today, but we have a short

00:19:22.402 --> 00:19:47.173
- Rotary video to show. It's very difficult to even define peace. It goes beyond, of course, the absence

00:19:47.173 --> 00:19:51.742
- of war. It's about

00:19:51.842 --> 00:20:04.471
- social equity. It's about living and feeling respected where you live. And it's about showing this respect

00:20:04.471 --> 00:20:16.274
- to every person. Respect to every person. As a humanitarian organization, peace is a cornerstone of

00:20:16.274 --> 00:20:18.398
- Rotary's mission.

00:20:18.594 --> 00:20:26.891
- We believe when people work to create peace in their communities, that change can have a global effect.

00:20:26.891 --> 00:20:35.187
- Our Peace Center alumni, our dedicated peace builders, are leading the charge for that change. In Peace

00:20:35.187 --> 00:20:43.404
- Fellowship, you feel the support of a very powerful organization, which is Rotary. The organization of the

00:20:43.404 --> 00:20:48.350
- program is unique because it is not purely academic. You meet

00:20:48.706 --> 00:20:56.501
- a lot of people coming from different parts of the world. So every new person brings you new perspectives,

00:20:56.501 --> 00:21:04.150
- new ideas. They come from those conflict areas or they are interested in working in those conflict areas

00:21:04.150 --> 00:21:11.580
- to minimize the effect of conflict. Their long-term role can be also in building mutual understanding

00:21:11.580 --> 00:21:17.918
- between the conflicting communities to make sure that future conflicts will not arise.

00:21:18.850 --> 00:21:27.594
- Peacebuilding is really a kind of work: mediate, transform conflict, stop conflict, create an environment

00:21:27.594 --> 00:21:36.175
- of peace, to connect people who made a choice to create these environments of peace. Try to bring people

00:21:36.175 --> 00:21:44.510
- from across the divide together in a way that they can see that the other doesn't have an enemy face.

00:21:44.802 --> 00:21:52.226
- Bringing people together across everything and to do whatever it is that they're interested in. People

00:21:52.226 --> 00:21:59.505
- can come together to play board games and people can come together to learn how to tango. This means

00:21:59.505 --> 00:22:03.902
- that peace, of course, ideally speaking, is a sort of desire,

00:22:04.226 --> 00:22:11.130
- but it's also a strong need we have in this world in order to offer people opportunity to thrive and

00:22:11.130 --> 00:22:18.034
- have an environment where human potential could flourish. There will always be times of conflict and

00:22:18.034 --> 00:22:25.143
- that's where the peace builders' role exists. They will try always to increase the mutual understanding

00:22:25.143 --> 00:22:28.766
- to make sure that those conflicts will not escalate.

00:22:28.898 --> 00:22:36.506
- So there is a strength that is coming from their commitment and dedication, their competencies and skills.

00:22:36.506 --> 00:22:43.758
- And this perhaps is going to fight against violence and fight against situations where peace is truly

00:22:43.758 --> 00:22:50.868
- missing. If people stop believing that it's possible, then it cannot happen. One of my worst quotes

00:22:50.868 --> 00:22:58.334
- is, if you want to have peace, you should prepare for war. I hate that quote. If you want to have peace,

00:22:58.690 --> 00:23:12.995
- you should prepare for peace. Alan Barker will introduce our speaker. Thank you very much, Steve. Hi,

00:23:12.995 --> 00:23:25.758
- everybody. It's great to see you. And that was an amazing video. It really was incredible.

00:23:25.858 --> 00:23:32.167
- And within the theme of peace, our talk today will illuminate a completely different area. And so I'm

00:23:32.167 --> 00:23:38.352
- delighted to introduce our very own Rotary member as guest speaker, Professor Scott J. Shackelford.

00:23:38.352 --> 00:23:44.599
- And I'm going to say a few things about him. This amazing individual who you probably all know about

00:23:44.599 --> 00:23:49.918
- is a Provost Professor of Business Law and Ethics at the IU Kelley School of Business.

00:23:50.594 --> 00:23:57.561
- He's also the director of the Center for Applied Cybersecurity Research and helps lead the Ostrom Workshop's

00:23:57.561 --> 00:24:04.401
- work on cybersecurity and internet governance. So he has a number of different roles at Indiana University

00:24:04.401 --> 00:24:11.049
- and makes a huge impact. In short, he spends his time thinking about and helping us shape how societies

00:24:11.049 --> 00:24:17.697
- can navigate the risks and opportunities of our increasingly digital world. Scott is one of the authors

00:24:17.697 --> 00:24:20.062
- of Securing Democracies in an Age of

00:24:20.162 --> 00:24:26.661
- Instability, a timely and important book that examines how democratic institutions are being challenged

00:24:26.661 --> 00:24:32.910
- by cybersecurity threats, disinformation, and rapid technological change. So the title of Scott's

00:24:32.910 --> 00:24:39.472
- talk today, Securing Democracies in the Digital Age, could not be more relevant and present in our world

00:24:39.472 --> 00:24:40.222
- these days.

00:24:40.898 --> 00:24:46.684
- At a moment marked by political polarization, geopolitical tensions, rapid advances in artificial

00:24:46.684 --> 00:24:52.824
- intelligence, and boy, do I know about that this morning, I had a bunch of conversations that have sort

00:24:52.824 --> 00:24:59.201
- of rocked me back on my heels, and widespread concerns about misinformation and cybersecurity interference,

00:24:59.201 --> 00:25:05.282
- questions about how democracies can remain resilient and trustworthy feel especially urgent. So, we're

00:25:05.282 --> 00:25:07.998
- very fortunate to have one of our own, Scott,

00:25:08.162 --> 00:25:15.789
- to help us make sense of these challenges, please join me in offering a warm Rotary welcome to one of

00:25:15.789 --> 00:25:23.492
- our own, Professor Scott Shackelford. Well, thank you so much. Good afternoon, everybody. How are you?

00:25:23.492 --> 00:25:30.970
- I have to say we timed this spectacularly, okay? So today, if you didn't, if you weren't aware, now

00:25:30.970 --> 00:25:33.886
- you will be, it is Safer Internet Day.

00:25:34.594 --> 00:25:39.848
- I know, I know. Every day, ideally, should be Safer Internet Day, but it's actually today. So a lot

00:25:39.848 --> 00:25:45.256
- of reasons to, I think, have this timely discussion right now. And actually, as part of Safer Internet

00:25:45.256 --> 00:25:50.510
- Day, I was giving a keynote for a conference in New Delhi earlier today. And you know the title of the

00:25:50.510 --> 00:25:55.917
- conference? The Cyber Peace Summit, the Global Cyber Peace Summit. They had more than 70 countries there.

00:25:55.917 --> 00:26:01.120
- So a little bit about what I'll be talking to you about today is very much a global movement. So it's

00:26:01.120 --> 00:26:03.518
- not all going to be doom and gloom, I promise.

00:26:03.682 --> 00:26:09.674
- It's a beautiful day outside. We'll try to let some of that light shine on this topic as well. Thanks

00:26:09.674 --> 00:26:15.608
- so much again for the opportunity and for that very warm welcome. It's great to be back. I wanted to

00:26:15.608 --> 00:26:21.482
- just do a few things upfront per usual. There's too much content to get through, so I'd rather have

00:26:21.482 --> 00:26:28.062
- time for discussion, especially about a topic that's on all of our minds. But a few things upfront.

00:26:28.194 --> 00:26:33.289
- One, if you haven't been over to the Ostrom workshop in a little while, come check us out. There's a

00:26:33.289 --> 00:26:38.334
- lot of exciting stuff going on, including with regards, Alan, as you were saying, to AI governance.

00:26:38.434 --> 00:26:43.706
- tech governance, as well as, of course, environmental challenges, you name it. We just did a joint session

00:26:43.706 --> 00:26:48.978
- with the Environmental Resilience Institute this last Friday on the environmental impacts of data centers,

00:26:48.978 --> 00:26:53.906
- for example, here in Indiana and around the world. Really awesome talks every Monday and Wednesday,

00:26:53.906 --> 00:26:58.833
- a lot of ways to get engaged, and we still have copies of that children's book, which I should have

00:26:58.833 --> 00:26:59.966
- brought with me today.

00:27:00.066 --> 00:27:04.577
- But if anybody wants to follow me over, we still have Lynn's on Common Life, which actually they're

00:27:04.577 --> 00:27:09.088
- doing a teacher training workshop on that in a couple weeks here for civics classes and things like

00:27:09.088 --> 00:27:11.614
- that. So her work is getting out there, which is great.

00:27:12.482 --> 00:27:17.642
- The Center for Applied Cybersecurity Research still does a lot of work on election security, but also

00:27:17.642 --> 00:27:22.953
- more broadly looking at local critical infrastructure protection. Real hot topic these days, which could

00:27:22.953 --> 00:27:28.265
- be a separate talk, is water. Water utilities in particular are a very unfortunate soft target. So we're

00:27:28.265 --> 00:27:33.475
- working with a lot of those around the state and beyond to see how we can come together and build some

00:27:33.475 --> 00:27:34.942
- resilience in those systems.

00:27:36.418 --> 00:27:42.213
- Okay, and here's some of the work on Cyber Peace. Bit of a trilogy, I'm still waiting on the movie deal.

00:27:42.213 --> 00:27:47.787
- Matt Damon, I mean, come on, you're done with the Odyssey, this is the natural next step. So I'll be

00:27:47.787 --> 00:27:53.306
- peppering in a few insights from that work on Cyber Peace, but again, the focus is gonna be on this

00:27:53.306 --> 00:27:58.383
- brand new book, and it just came out last month on securing democracies in the digital age.

00:27:58.383 --> 00:28:04.068
- And that book, I'm happy to say, it's published with Cambridge Press, but it's open access. So anybody

00:28:04.068 --> 00:28:06.110
- around the world can freely download

00:28:06.178 --> 00:28:12.067
- and access the contents, no paywalls, nothing like that. I wouldn't be really preaching the gospel of the

00:28:12.067 --> 00:28:18.359
- commons if it was any other way, but I still wanted to flag that, right? And kudos upfront to my co-editors,

00:28:18.359 --> 00:28:24.248
- Frédéric Doucet from Paris 8 and Chris Ankersen from NYU. There was a slate of amazing scholars and

00:28:24.248 --> 00:28:30.367
- practitioners all around the world that were involved with this project. The focus of the book, if you're

00:28:30.367 --> 00:28:35.390
- interested, was basically three geopolitical hotspots. So we looked at Eastern Europe,

00:28:35.554 --> 00:28:42.124
- We looked at the Middle East and North Africa, and we looked at East Asia and South Asia to see how these advancing

00:28:42.124 --> 00:28:48.333
- and emerging democracies are dealing with a whole array of challenges, as you just heard in the intro,

00:28:48.333 --> 00:28:54.541
- right? So threats to election infrastructure, how we manage misinformation, disinformation, deepfakes.

00:28:54.541 --> 00:29:00.870
- And a lot of the US focus was on how we, as in the US, are managing those threats from abroad. We didn't

00:29:00.870 --> 00:29:02.558
- focus on homegrown threats.

00:29:02.722 --> 00:29:08.523
- We can take this in lots of directions. I grant you there's rabbit holes upon rabbit holes that we can

00:29:08.523 --> 00:29:13.705
- go down, but for purposes of the book and my presentation, just know that it's more of that

00:29:13.705 --> 00:29:19.337
- global perspective. Okay, so very briefly, when we approach this topic of protecting our elections,

00:29:19.337 --> 00:29:24.350
- it's good to think about it first at the 30,000 foot level, then we'll drill down, okay?

00:29:24.482 --> 00:29:30.672
- Critical infrastructure, what the heck is that? Here in the US, we have 16 critical infrastructure sectors,

00:29:30.672 --> 00:29:36.575
- and you see them all labeled up there. They run the gamut. Think about everything that's really vital

00:29:36.575 --> 00:29:42.593
- that if it goes down, you're having a really bad day, right? So healthcare, telecom, right? Agriculture,

00:29:42.593 --> 00:29:48.553
- okay? Elections is one of those systems. And the US, there's only a few countries around the world that

00:29:48.553 --> 00:29:53.310
- specify that, but we're one of them actually, and that's been the case since 2017.

00:29:53.602 --> 00:29:59.643
- So that means that organizations historically, like DHS, have a bigger role in safeguarding election

00:29:59.643 --> 00:30:05.982
- infrastructure and systems. I just wanted to mention that up front. This has been a long-running problem.

00:30:05.982 --> 00:30:12.382
- We see that in the elections context and more broadly. There have been groups that have tried to undermine

00:30:12.546 --> 00:30:17.814
- election outcomes and frankly just confidence in elections dating way back, believe it or not, if you

00:30:17.814 --> 00:30:23.134
- guys remember, the South African elections that actually elected Nelson Mandela in the early 90s. That

00:30:23.134 --> 00:30:28.350
- was one of the first documented attempts to use some early internet worms to interfere with voting

00:30:28.350 --> 00:30:33.566
- machines, right? So that was a little while ago now, more than 30 years. Of course, as was mentioned

00:30:33.566 --> 00:30:36.510
- these days, the technologies, the techniques have really

00:30:36.674 --> 00:30:41.553
- advanced quite a bit in some scary ways, and we're all racing to catch up. So again, I'll be focusing

00:30:41.553 --> 00:30:46.385
- mostly on some of that stuff. There's been a variety of attacks on these different types of critical

00:30:46.385 --> 00:30:51.552
- infrastructure over the years. We can talk more about it, but suffice it to say, everybody's been breached,

00:30:51.552 --> 00:30:56.335
- right? And there's this old joke that if you think you haven't been breached, well,

00:30:56.335 --> 00:31:01.167
- you just haven't found out yet, right? I mean, we don't need to be told that here in Monroe County even,

00:31:01.167 --> 00:31:06.238
- right? Like it happens. It happens with some regularity. And of course, IU is no stranger to that either.

00:31:06.690 --> 00:31:13.133
- Globally, there are some big trends that are feeding this, right? So there's a lot of geopolitics lurking

00:31:13.133 --> 00:31:19.271
- in the background, as always. Unfortunately, Ukraine has been a testbed for both information warfare

00:31:19.271 --> 00:31:25.410
- and cyber warfare since their elections in 2013, right? So we've seen that, of course, most recently

00:31:25.410 --> 00:31:27.902
- in the international armed conflict, but

00:31:28.034 --> 00:31:33.860
- They've also pioneered, as I'll talk about in just a minute, some responses for how you get ahead of

00:31:33.860 --> 00:31:39.628
- disinformation and deep fakes. They basically have a version of the PBS NewsHour, where every night

00:31:39.628 --> 00:31:45.627
- they're debunking the kind of fake news or the deep fake story of the day in real time, right? And it's

00:31:45.627 --> 00:31:47.934
- different when you think about Beijing.

00:31:48.194 --> 00:31:53.942
- Okay, so in the book, we try to do this with some granularity, figuring out how different adversaries

00:31:53.942 --> 00:31:59.747
- around the world who are commonly trying to undermine trust collectively in the democratic process are

00:31:59.747 --> 00:32:04.030
- approaching these, right? You see there that Russia versus China's approach

00:32:04.162 --> 00:32:09.601
- could not be more different. They're at opposite ends of this spectrum, right? One is trying to

00:32:09.601 --> 00:32:15.255
- sow chaos, confusion, right? The other is really trying to plant some seeds to undermine long-term trust

00:32:15.255 --> 00:32:20.640
- in the stability of these systems, such as by sowing divisions, but also more broadly. And I'll say

00:32:20.640 --> 00:32:26.295
- we're not unique in the U.S. in that regard. These campaigns have been playing out all around the world.

00:32:26.295 --> 00:32:31.841
- I promise it's not all doom and gloom. We're gonna get there. There's a lot of good stuff, good work

00:32:31.841 --> 00:32:33.726
- that's being done here in Indiana.

00:32:33.858 --> 00:32:39.408
- and more broadly to get a better handle on some of these challenges. Some of them are just frankly voting

00:32:39.408 --> 00:32:44.697
- machines, right? Like here in Monroe County, how do we vote? If you remember, usually it's a

00:32:44.697 --> 00:32:49.985
- touch screen and there's like a little, or you have a little optical scan paper ballot that you scan

00:32:49.985 --> 00:32:55.274
- in, right? Depending on the place. Those are gonna be phased out, but anybody know

00:32:55.274 --> 00:33:00.510
- when they're gonna be phased out by? The Indiana legislature passed this law a couple of years ago.

00:33:00.510 --> 00:33:01.662
- 2027 is our deadline.

00:33:01.826 --> 00:33:07.042
- All right. So not super forward leaning. You'll see a map in a second that's going to give you some

00:33:07.042 --> 00:33:12.414
- more details on that. Other states, you know, are a bit further along, and in part that's driven by the fact

00:33:12.414 --> 00:33:17.682
- that there are problems, as you see here, in the software ecosystem supporting those different machines

00:33:17.682 --> 00:33:23.315
- and, more broadly, in the supply chain. We've been doing a lot of work on how you rethink accountability

00:33:23.315 --> 00:33:28.688
- for software. So if you're Microsoft, there's something called Patch Tuesday, right? You put out a new

00:33:28.688 --> 00:33:29.470
- product today,

00:33:29.602 --> 00:33:35.003
- If it causes a problem, you patch it tomorrow. And who's responsible for all the damage that happens

00:33:35.003 --> 00:33:40.617
- in the meantime? Maybe you remember the CrowdStrike incident a couple years ago, where there was a patch

00:33:40.617 --> 00:33:46.124
- that went out, caused a world of hurt if you were flying that day, but also more broadly, the NHS, for

00:33:46.124 --> 00:33:51.579
- example, systems went down. But the question that's always been lurking is, well, why aren't we holding

00:33:51.579 --> 00:33:57.086
- tech companies liable for all that? We're treating them with kid gloves. Just as of last month, Europe

00:33:57.186 --> 00:34:02.788
- has extended its Product Liability Directive to software. That's a fancy legal way of saying, if

00:34:02.788 --> 00:34:08.334
- there's a bug in code that leads to real world harm, now in Europe, you get to hold the tech company

00:34:08.334 --> 00:34:13.991
- liable, strict liability. I was involved with a software liability summit at the White House. This was

00:34:13.991 --> 00:34:19.647
- a while ago now, mid-24, where we were talking about the same thing, should the US follow suit or not.

00:34:19.647 --> 00:34:25.469
- That's a game changer. We're starting to see that with things like social media for kids. We're not seeing

00:34:25.469 --> 00:34:26.622
- it more broadly yet.

00:34:26.914 --> 00:34:32.238
- But still, it's worth kind of keeping that broader scope in mind. And lastly, these are some of the

00:34:32.238 --> 00:34:37.828
- big trends feeding into all this. I'll say a bit more about AI, but honestly, you name it, each of these

00:34:37.828 --> 00:34:43.205
- is important for thinking about ways that we can secure democracies. Even internet access from space

00:34:43.205 --> 00:34:48.529
- is really taking off. Forgive the pun. We're actually launching a space cybersecurity digital badge

00:34:48.529 --> 00:34:53.374
- program later today here at IU, which is pretty cool. So there's a lot of neat work there.

00:34:53.794 --> 00:34:59.235
- And again, more broadly, there's ways to think about this from more of a peace perspective as a shared

00:34:59.235 --> 00:35:04.730
- responsibility. And there's different ways to approach that. Okay, given time, let's focus on democracy

00:35:04.730 --> 00:35:10.066
- here. So in the book, we're trying to look at this from a variety of different perspectives. There's

00:35:10.066 --> 00:35:15.613
- this grid that gives you a bit of a flavor for how we think about democratic resilience. Yes, it's about

00:35:15.613 --> 00:35:20.949
- protecting election infrastructure. So the voting machines, the tabulation systems that count up the

00:35:20.949 --> 00:35:23.326
- votes, but it's also a much broader problem.

00:35:23.714 --> 00:35:29.158
- If it was that easy, we would have gotten ahead of it a while ago. So this just kind of gives you a

00:35:29.158 --> 00:35:34.764
- flavor of that. You'll see specifics when we dig into the case study about what went down in 2016 and

00:35:34.764 --> 00:35:40.426
- thereafter. This was a fun one. I did a separate book, a history book, called Forks in the Digital Road

00:35:40.426 --> 00:35:46.141
- that looked back at the history of the internet and cybersecurity. There's been hacking going on a long,

00:35:46.141 --> 00:35:51.966
- long time, including, as you all know, hacking of elections, depending on how you like to define the term.

00:35:52.098 --> 00:35:58.382
- Some cities, some states are notorious for this kind of thing. Of course, what's different is the scale

00:35:58.382 --> 00:36:04.786
- that's possible by undermining the shared systems that all of this infrastructure relies upon. And that's

00:36:04.786 --> 00:36:11.070
- one of the things we're focusing on here. Like we said, here in Monroe County, mostly we're using these

00:36:11.070 --> 00:36:17.475
- optical scan paper ballots. Now, help me. We'll make it a bit more interactive because I know it's middle

00:36:17.475 --> 00:36:20.254
- of the afternoon. I need some coffee as well.

00:36:20.418 --> 00:36:27.395
- From a security perspective, what do we think? Which of these is going to be best? Should we go back

00:36:27.395 --> 00:36:34.302
- to those hanging chads? Oh man, those are the good old days. Great Halloween costumes, though. What

00:36:34.302 --> 00:36:41.625
- do you think? From a security perspective, you want to have as many layers of authentication as possible,

00:36:41.625 --> 00:36:47.358
- right? So you can double check the results in close elections. What do you think?

00:36:49.218 --> 00:36:54.530
- Okay, okay, okay. So in other words, if you have some kind of system where there can be a paper audit

00:36:54.530 --> 00:36:59.947
- after the fact, that can build confidence and ultimately trust in the democratic process, right? What's

00:36:59.947 --> 00:37:05.311
- the problem if you don't have the paper trail? Like a lot of jurisdictions still don't around Indiana.

00:37:05.311 --> 00:37:10.623
- Well, in a close election, or if there's intelligence after the fact that in some way, shape or form,

00:37:10.623 --> 00:37:16.248
- it was undermined, it's tough to go back forensically and double check those results. So the gold standard,

00:37:16.248 --> 00:37:17.758
- long story short these days,

00:37:18.370 --> 00:37:24.189
- is what's called a risk limiting audit. So you have a jurisdiction, I'll give you a flavor for what

00:37:24.189 --> 00:37:30.183
- this looks like nationwide. So you pick a state where they're all green. You institute a risk limiting

00:37:30.183 --> 00:37:36.177
- audit and you take a statistical sample of the votes cast and then you compare it against the reported

00:37:36.177 --> 00:37:41.182
- results just to make sure they're consistent. Colorado was the pioneer in doing that.

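The sampling-and-comparison idea behind a risk-limiting audit can be sketched in a few lines of code. This is a simplified, hypothetical illustration with invented candidate names and ballot data, not any state's actual procedure; real risk-limiting audits, like Colorado's, use statistical stopping rules that keep sampling until a chosen risk limit is met, rather than a fixed sample size.

```python
import random

def audit_sample(paper_ballots, machine_records, sample_size, seed=2026):
    """Draw a reproducible random sample of ballots and count how many
    paper ballots disagree with the machine's recorded interpretation.
    (A real risk-limiting audit keeps sampling until a statistical risk
    limit is satisfied; a fixed sample size is used here for simplicity.)"""
    rng = random.Random(seed)  # seed generated publicly so anyone can re-run the draw
    picks = rng.sample(range(len(paper_ballots)), sample_size)
    return sum(1 for i in picks if paper_ballots[i] != machine_records[i])

# Hypothetical election: the paper trail and the machine records agree everywhere.
paper = ["Candidate A", "Candidate B"] * 500
machine = list(paper)
print(audit_sample(paper, machine, sample_size=100))  # prints 0: no mismatches found
```

In actual audits, the random seed is typically generated at a public ceremony (for example, by rolling dice) so that outside observers can verify the sample draw was fair.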
00:37:41.314 --> 00:37:46.853
- Then Rhode Island, we have a half a dozen states that have these laws on the books. Indiana's going

00:37:46.853 --> 00:37:52.558
- in that direction. There's some legislation around a risk-limiting audit, but again, we don't have the

00:37:52.558 --> 00:37:58.152
- underlying tech yet to really make it possible statewide. When you look at this map, it looks pretty

00:37:58.152 --> 00:38:03.968
- different from 2016. So some states have really gotten on the ball. So Georgia, back in 2016, was bright

00:38:03.968 --> 00:38:05.630
- red. All right. Pennsylvania.

00:38:06.050 --> 00:38:12.057
- bright red. In other words, some core swing states had some technology that was problematic. There have

00:38:12.057 --> 00:38:17.948
- been groups that were able to hack into some of these voting machines that were still running Windows

00:38:17.948 --> 00:38:23.840
- XP just to show what was possible. It's not great. That's not great. As a reminder, Windows XP hasn't

00:38:23.840 --> 00:38:29.731
- been updated since 2014. So nobody's really double checking that. So that's a problem. And we haven't

00:38:29.731 --> 00:38:34.814
- really invested in this stuff at a federal level since that 2002 Help America Vote Act.

00:38:35.010 --> 00:38:40.182
- That was, of course, in response to all the hanging chads. It was about a $4 billion investment that

00:38:40.182 --> 00:38:45.507
- went to the states, because, again, we don't have one national election, right? We have more than 3,000

00:38:45.507 --> 00:38:50.781
- local elections that get aggregated up. So it helped to purchase a lot of equipment. But back in 2002,

00:38:50.781 --> 00:38:56.157
- what was the main headache we were trying to solve? Those long lines at polling places and the confusion

00:38:56.157 --> 00:39:01.329
- around hanging chads, right? Cyber security wasn't exactly front of mind. So what did a lot of these

00:39:01.329 --> 00:39:02.814
- jurisdictions go out and do?

00:39:03.298 --> 00:39:09.426
- including in Georgia and Pennsylvania, well, they bought a lot of these handy systems where they're

00:39:09.426 --> 00:39:15.798
- just touchscreens, but without a paper trail, right? So it's that kind of technical debt that now we're

00:39:15.798 --> 00:39:21.987
- having to go back to and refresh, right? Here in the US, but also around the world. Okay, so looking

00:39:21.987 --> 00:39:28.237
- to the midterms this fall, but in 2024, you know, equally valid, this just kind of gives you a flavor

00:39:28.237 --> 00:39:32.894
- of all the different layers that the bad guys could focus on and try to sow

00:39:33.058 --> 00:39:39.482
- distrust, right, or try to manipulate some processes. So that goes all the way from voter information.

00:39:39.482 --> 00:39:45.781
- For that information, we rely on platforms, right? So as you know, social media is flooded these days

00:39:45.781 --> 00:39:52.642
- with AI-generated content. I mean, I think that's 80% of X at this point. I don't even know. It's ridiculous.

00:39:52.642 --> 00:39:54.014
- So what can we trust?

00:39:54.210 --> 00:40:00.422
- Who really said what? What video is legitimate? That's already become a real issue. Some states, including

00:40:00.422 --> 00:40:06.285
- Wisconsin, even Indiana, have some new laws on the books that actually attach civil penalties if you

00:40:06.285 --> 00:40:12.323
- spread disinformation or deep fakes involving candidates within a certain number of days of an election

00:40:12.323 --> 00:40:18.128
- cycle. So that's fine. That's one little aspect of this issue. Finding the perpetrators, et cetera,

00:40:18.128 --> 00:40:22.366
- is easier said than done. But that's only, obviously, one piece of this.

00:40:22.626 --> 00:40:28.253
- Again, other jurisdictions like the EU, they've gotten a lot further in this regard. There's some pretty

00:40:28.253 --> 00:40:33.719
- stiff penalties. Election rolls. So these are basically where you're registered to vote. So those can

00:40:33.719 --> 00:40:39.239
- be manipulated as well. So picture like an elaborate Excel spreadsheet. If you can get access to that,

00:40:39.239 --> 00:40:44.062
- and unfortunately, there's not even a state law still requiring us to encrypt all that data,

00:40:44.226 --> 00:40:49.092
- So there's still some precincts that have it unencrypted. You can imagine any kind of other

00:40:49.092 --> 00:40:54.593
- Excel spreadsheet, right? You can manipulate the cells. You can make it look like somebody's registered

00:40:54.593 --> 00:41:00.146
- to vote there when they're actually not. That can lead to long lines, sow distrust,

00:41:00.146 --> 00:41:05.541
- and again, ultimately undermine confidence. No evidence that happened in 24, to be clear, but that is

00:41:05.541 --> 00:41:08.926
- the issue. And we know back in 2016, election rolls in Illinois

00:41:09.090 --> 00:41:14.213
- were compromised. Again, no evidence that anything was changed, but because so much of it is

00:41:14.213 --> 00:41:19.942
- still unencrypted, that's still an issue. Voting machines, we talked about. Tabulation systems are just

00:41:19.942 --> 00:41:25.616
- fancy software for adding up votes, all right? Like any software, that can be manipulated. No evidence

00:41:25.616 --> 00:41:31.180
- that happened in 24, but other democracies, like the Netherlands, have been so concerned that in one

00:41:31.180 --> 00:41:35.422
- of their last elections, they went back and hand counted every paper ballot.

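A tabulation system really is, at heart, software for adding up votes, and a Netherlands-style full hand count is just a recount of every paper ballot compared against the reported totals. A toy sketch of both, with invented ballot data (real tabulation systems are of course far more involved):

```python
from collections import Counter

def machine_tally(ballots):
    """Toy tabulation system: count the votes for each candidate."""
    return Counter(ballots)

def hand_count_matches(paper_ballots, reported):
    """What a full hand count checks: recount every paper ballot
    and compare the totals against the reported results."""
    return Counter(paper_ballots) == reported

ballots = ["A", "B", "A", "C", "A", "B"]
reported = machine_tally(ballots)
print(hand_count_matches(ballots, reported))   # prints True: totals check out

# If the tabulation software were manipulated to shift one vote...
tampered = reported.copy()
tampered["A"] -= 1
tampered["B"] += 1
print(hand_count_matches(ballots, tampered))   # prints False: the paper trail catches it
```

The point of the sketch is the second check: without a paper trail there is nothing independent to recount, so a manipulated total is indistinguishable from a correct one.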
00:41:36.450 --> 00:41:42.326
- They didn't have any trust in their tabulation systems because they had some intelligence that they

00:41:42.326 --> 00:41:48.319
- had been compromised. Dissemination of news is also a problem, right? So imagine what you could do if

00:41:48.319 --> 00:41:54.312
- you hacked into the AP Newswire on election day to make it look like a candidate or a party was ahead

00:41:54.312 --> 00:42:00.423
- or behind in the polls in a country like ours with multiple time zones. That could also impact turnout,

00:42:00.423 --> 00:42:05.182
- right? And also sow distrust. No evidence that's happened to the AP or otherwise

00:42:05.314 --> 00:42:10.466
- But that's still a potential headache. And we have seen that, unfortunately, in other countries like

00:42:10.466 --> 00:42:15.670
- Ukraine, in terms of news sites being targeted in that way, which gets us thinking about, well, heck,

00:42:15.670 --> 00:42:20.975
- is news critical infrastructure? That has other big First Amendment problems, right? If we do have more

00:42:20.975 --> 00:42:26.076
- of a government role there. Talked about critical infrastructure, too. So if you can snarl traffic,

00:42:26.076 --> 00:42:31.330
- if you can cause blackouts, you name it, not across the nation, but just in certain core swing cities,

00:42:31.330 --> 00:42:34.238
- certain core swing states, that can also be problematic.

00:42:34.498 --> 00:42:41.692
- right? So these broader issues can come back and lurk in this context as well. Okay, what are we doing?

00:42:41.692 --> 00:42:43.006
- We're doing a lot.

00:42:43.362 --> 00:42:49.592
- There have been some positive steps forward. One of them was in the information sharing network called

00:42:49.592 --> 00:42:55.761
- the EI-ISAC. So basically it's an information sharing and analysis center. It allows election officials

00:42:55.761 --> 00:43:01.809
- to share information and learn from intelligence about threats in real time. We didn't have that in

00:43:01.809 --> 00:43:05.982
- 2016. Now the issue is that right now DHS is in the process of unwinding it.

00:43:06.082 --> 00:43:12.361
- So looking ahead to the 2026 election, a lot of local election officials potentially won't have as updated

00:43:12.361 --> 00:43:18.639
- information as they've grown accustomed to for the last seven years. There are some other shifts that

00:43:18.639 --> 00:43:24.856
- have played out. Again, if we had more time, we could talk about how some of the parties are responding

00:43:24.856 --> 00:43:27.134
- and how we're looking more globally.

00:43:27.394 --> 00:43:32.898
- Again, the good news is that we are working together more than has been the case in the past and that

00:43:32.898 --> 00:43:38.349
- the word is getting out, especially with a pretty wide array of democracies about some of these best

00:43:38.349 --> 00:43:44.285
- practices. So in the book, we dig into these in a little bit more detail. By and large, all these democracies

00:43:44.285 --> 00:43:46.174
- and plenty others around the world

00:43:46.274 --> 00:43:51.655
- are embracing a lot of these best practices. So a lot that had electronic voting machines are going

00:43:51.655 --> 00:43:57.197
- back to paper or involving paper at some point in the process. They're doing risk limiting audits. The

00:43:57.197 --> 00:44:02.686
- only real country that's going the opposite direction is Estonia. They still allow online voting, and

00:44:02.686 --> 00:44:08.120
- they have since the early 2000s, right? But keep in mind, this is a country pretty homogenous with a

00:44:08.120 --> 00:44:09.950
- population the size of San Diego.

00:44:10.210 --> 00:44:15.987
- So a bit of a different vibe than the United States in that regard. And they're different in other ways,

00:44:15.987 --> 00:44:21.598
- too. It's an incredible country, though, if you haven't been. All the others, right? Embracing paper.

00:44:23.586 --> 00:44:28.355
- Which just gets us thinking about these other layers of the problem, though, okay? So you might remember

00:44:28.355 --> 00:44:33.172
- this seems so antiquated at this point, but there was this whole like puffy jacket thing. This was an

00:44:33.172 --> 00:44:37.941
- early deep fake of the Pope that really went viral, right? And this was one of the first times like,

00:44:37.941 --> 00:44:42.238
- wow, I had no idea he had such fashion sense, right? People weren't aware of what all these

00:44:42.434 --> 00:44:48.172
- technologies can do. If you're a relative newcomer to AI and deepfakes, I think we

00:44:48.172 --> 00:44:53.853
- can send out the slides. I included some links to some really great podcasts like any Radiolab fans

00:44:53.853 --> 00:44:59.591
- out there. They went and they interviewed the original researchers from the University of Washington

00:44:59.591 --> 00:45:05.328
- who put out some of these tools. These days, it's incredibly easy to do. Let me just

00:45:05.328 --> 00:45:10.782
- ask, how many of you have tried to create an AI generated picture or video at some point? Okay.

00:45:11.330 --> 00:45:19.214
- I'd say at least a third, right? Bordering on half. What was that experience like? Was it pretty easy

00:45:19.214 --> 00:45:27.253
- to do? Yeah? Yeah, Alan, as well? Okay, okay. If you don't mind me asking, what tool did you use? Yeah.

00:45:27.253 --> 00:45:35.601
- Okay. Most all of the big AI platforms can do this now, right? So Gemini, for example, Google's version.

00:45:35.601 --> 00:45:40.702
- Sora, which is basically like slop on steroids, from OpenAI.

00:45:40.834 --> 00:45:47.070
- I mean, you name it, it's incredibly easy. And frankly, it's just so easy, and it requires basically

00:45:47.070 --> 00:45:53.129
- no technological sophistication at this point that the cat's completely out of the bag, right? So the

00:45:53.129 --> 00:45:59.127
- only real options are legislation to really hold tech platforms fully accountable, buck stops there,

00:45:59.127 --> 00:46:03.582
- you can't put it on the users, right? Or we have to think about, you know,

00:46:03.682 --> 00:46:09.662
- everything else that would be nice, but isn't going to change things, you know, full stop. It's going

00:46:09.662 --> 00:46:15.524
- to be a long-term evolution, basically digital citizenship training, right? That's important. We do

00:46:15.524 --> 00:46:21.504
- it in schools. We could do a better job. But frankly, there's not a lot of other panaceas out there.

00:46:21.504 --> 00:46:27.953
- Okay. Here in the US, we're also, you know, we have, I don't want to say handcuffs exactly, but we definitely

00:46:27.953 --> 00:46:32.350
- have our freedom of movement at the federal level limited because of this.

00:46:32.546 --> 00:46:37.968
- Right? So if you're not familiar, Section 230 of the Communications Decency Act, this is what gives

00:46:37.968 --> 00:46:43.553
- those shields to those tech platforms. OK? So if it wasn't for this law, and it was passed with really

00:46:43.553 --> 00:46:49.030
- good intentions, and frankly, it made a lot of sense in the 90s, because at that time, you could sue

00:46:49.030 --> 00:46:54.506
- the equivalent of Google or Meta or Facebook, you name it, because of the content they host on their

00:46:54.506 --> 00:47:00.146
- site. So if there was hate speech that you came across on Facebook, you could sue Facebook. Right? This

00:47:00.146 --> 00:47:02.206
- was the law saying you can't do that.

00:47:02.594 --> 00:47:07.964
- All right, simply put. And because of that, that's really changed the incentives for these companies

00:47:07.964 --> 00:47:13.280
- to not really moderate as much as they could. The issue is everybody agrees they don't like Section

00:47:13.280 --> 00:47:18.650
- 230 anymore, but nobody can come together and come up with a consensus about what do we do about it?

00:47:18.650 --> 00:47:24.126
- All right. Like Biden, for example, and Trump agree on almost nothing. They both hated this. Okay. But

00:47:24.126 --> 00:47:29.708
- what comes next? If you take this down, you know, the incentives can go one extreme or the other. Either

00:47:29.708 --> 00:47:30.878
- it's super moderated,

00:47:31.010 --> 00:47:36.371
- sanitized, or a digital hellscape, right? There's not a lot of movement in between the two. So that's

00:47:36.371 --> 00:47:42.001
- why we still have it, but it does limit a little bit what we can do at the federal level and as a result

00:47:42.001 --> 00:47:47.576
- at the state level. What are other countries doing? Well, a lot of experimentation. I mentioned Ukraine

00:47:47.576 --> 00:47:52.991
- before. In the UK, actually, they treat coding as a second language. All their kids are getting some

00:47:52.991 --> 00:47:54.814
- coding as they go through school.

00:47:55.010 --> 00:47:59.949
- And they're also doing digital citizen training, including spotting AI and deep fake content as part

00:47:59.949 --> 00:48:04.888
- of that. They can do that, of course, because it's the UK. There's a national curriculum. They don't

00:48:04.888 --> 00:48:10.071
- have the same setup that we have here in the States. But that doesn't mean school districts, including

00:48:10.071 --> 00:48:15.059
- MCCSC and otherwise, couldn't learn from some of those experiences and help kind of raise the overall

00:48:15.059 --> 00:48:19.949
- level of cyber hygiene and awareness. There's also some major international initiatives. The Munich

00:48:19.949 --> 00:48:21.758
- AI Election Accords are one of them.

00:48:21.890 --> 00:48:27.193
- So all the major democracies, this was about a year and a half ago now, and all the tech platforms got

00:48:27.193 --> 00:48:32.341
- on board with that to limit the spread of disinformation, deep fakes, and violent extremism online.

00:48:32.341 --> 00:48:37.592
- But again, they're norms, right? So what happens when they're violated? This gives you a sense on the

00:48:37.592 --> 00:48:42.894
- AI side, what's happening at the regional level and the national level. I kept the US executive order,

00:48:42.894 --> 00:48:48.094
- even though it's been repealed, just to give you a flavor for what it would have done. So in the EU,

00:48:48.094 --> 00:48:49.278
- they have this kind of

00:48:49.378 --> 00:48:55.278
- three-tier structure for what's the most problematic uses of AI, all right? Elections is one of them,

00:48:55.278 --> 00:49:01.236
- all right? But there's some other things too, like spotting somebody's emotional state in public. That

00:49:01.236 --> 00:49:07.136
- was also banned, okay? Social credit scores of the kind they do in China, that was also banned in the

00:49:07.136 --> 00:49:13.499
- EU. Certain healthcare applications, certain autonomous driving applications are much more heavily regulated.

00:49:13.499 --> 00:49:18.878
- So they have this three-tier structure that we don't have, okay? In the US, what do we have?

00:49:19.106 --> 00:49:24.740
- Well, one, a lot of action at the state level. So California has really filled the gap here along with

00:49:24.740 --> 00:49:30.265
- a variety of other states, including with regards to some new prohibitions on what you can and can't

00:49:30.265 --> 00:49:36.064
- spread online and penalties for tech companies that violate some of those. And I can go into more details

00:49:36.064 --> 00:49:41.753
- if folks are curious. This is what the executive order would have done if it would have stayed in place

00:49:41.753 --> 00:49:47.934
- basically for these big large language models that have national security implications for what they roll out.

00:49:48.162 --> 00:49:54.079
- They would have had to get basically a red team pre-approval first. But again, that's now gone by the

00:49:54.079 --> 00:49:59.996
- wayside. Okay. Important role for international law in all of this. If we had more time, you know, we

00:49:59.996 --> 00:50:01.214
- could talk about it.

00:50:01.346 --> 00:50:06.741
- But you can also just read the book, which is great. It's a good beach read for spring break. I promise

00:50:06.741 --> 00:50:12.084
- it's going to help you feel good. This gives you a sense of all the contributors and where they're coming

00:50:12.084 --> 00:50:17.323
- from. As you can tell, lots of universities, lots of institutions all around the world. Andy Grotto,

00:50:17.323 --> 00:50:22.614
- who was in the White House, the Office of the National Cyber Advisor during the Obama administration,

00:50:22.614 --> 00:50:28.113
- he's now at Stanford, did a great piece looking at the history of US election security. I'd really heartily

00:50:28.113 --> 00:50:29.150
- recommend that one.

00:50:30.402 --> 00:50:35.638
- And ultimately, I'll just leave us with thinking about this. It's a whole of society approach, right?

00:50:35.638 --> 00:50:40.874
- As Lin Ostrom famously said, there are no panaceas, and trust is the most important resource. There's

00:50:40.874 --> 00:50:46.059
- not a lot of trust to go around these days. I know, I know. But this is an area where I think we can

00:50:46.059 --> 00:50:51.244
- come together. I think the peace-building literature is so important in that regard. And think about

00:50:51.244 --> 00:50:53.246
- all the different layers here locally.

00:50:53.474 --> 00:50:58.801
- And in working with the state, because frankly, some things like protecting kids from social media, they're

00:50:58.801 --> 00:51:04.029
- pretty bipartisan these days, right? And when push comes to shove, a lot of the stuff that we're talking

00:51:04.029 --> 00:51:09.306
- about here is still pretty bipartisan too. So there's a lot more that we can do. And ideally, do together

00:51:09.306 --> 00:51:12.542
- so that we have democracies helping each other around the world.

00:51:12.674 --> 00:51:18.513
- And we've seen some green shoots in that regard. I saw some wonderful things happening in India earlier

00:51:18.513 --> 00:51:24.296
- today and across the global South. And I hope there's a time here in the US where we can more actively

00:51:24.296 --> 00:51:29.911
- contribute to that conversation at the federal level too. So again, I will pause there to make sure

00:51:29.911 --> 00:51:35.862
- that we have time for at least a couple questions or thoughts, but just thanks again for the opportunity.

00:51:35.862 --> 00:51:38.782
- And don't let any of this convince you not to vote.

00:51:38.882 --> 00:51:49.146
- All right, foreign adversaries trying to undermine confidence, and even domestic ones,

00:51:49.146 --> 00:51:58.925
- that's how they win, right? So cast a vote, participate. That's the main takeaway. So yeah, happy to

00:51:58.925 --> 00:52:00.862
- take any questions.

00:52:02.434 --> 00:52:09.634
- So thank you, Scott. That was amazing. That was incredible. And just thinking about you and the depth

00:52:09.634 --> 00:52:16.693
- of knowledge that you have and the security that we have that you're delivering information that we

00:52:16.693 --> 00:52:23.963
- can believe seems to be- Or am I a deep fake? You don't know. Exactly. Or you're a plant of some kind.

00:52:23.963 --> 00:52:27.422
- That's right. But anyway, my question really is,

00:52:27.650 --> 00:52:35.603
- more sort of a very general question of your sense of whether or not disinformation and fake information

00:52:35.603 --> 00:52:43.329
- is propelling itself into our universe to the point where all of the information that you've provided

00:52:43.329 --> 00:52:50.979
- to us, all of the serious research, all of the dedicated commitment that you've had is being sort of

00:52:50.979 --> 00:52:55.902
- washed away by this alternative reality that seems to be sort of

00:52:56.066 --> 00:53:03.038
- like the slop that's coming towards us. Yeah. Yeah. I mean, I would still like to think, you know, Alan,

00:53:03.038 --> 00:53:09.878
- that this stuff resonates and is broadly useful, but you're right. It is an echo chamber. And in a lot

00:53:09.878 --> 00:53:17.182
- of these, especially on social media, those algorithms and how they're feeding all of us matter a lot, right?

00:53:17.538 --> 00:53:22.977
- So I think it's up to us in higher education to inform and push back and burst those bubbles and to

00:53:22.977 --> 00:53:28.470
- shine a light that they exist because oftentimes it is just too easy to fall into a groove and just,

00:53:28.470 --> 00:53:34.181
- you know, be very, very happy with your own little echo chamber, right? So that's why I think, you know,

00:53:34.181 --> 00:53:39.294
- work like this, again, that's open access and freely available is, you know, super important.

00:53:39.490 --> 00:53:47.258
- So I would appreciate that chance to, you know, talk about it and would love to share, you know, preach

00:53:47.258 --> 00:53:54.727
- the gospel as it were to any other folks in groups who, you know, might come to mind too. So in the

00:53:54.727 --> 00:54:00.478
- Netherlands, what happened with that election? Was it accurate? Do you know?

00:54:00.962 --> 00:54:07.494
- It was okay, but they were able to do it in part again because they had the system set up to do that

00:54:07.494 --> 00:54:13.962
- hand counting. It's a different scale than we're used to, but it was not a problem because they got

00:54:13.962 --> 00:54:20.558
- the intelligence in advance and they had the backup processes available. So that was a good news tale

00:54:20.558 --> 00:54:22.046
- about what was caught.

00:54:22.146 --> 00:54:27.505
- That was OK. There have been a lot of other attempts, as you would guess, across Europe, especially

00:54:27.505 --> 00:54:32.971
- given the ongoing war there, to undermine confidence and manipulate the results of a lot of different

00:54:32.971 --> 00:54:38.651
- elections. But they have something called a cooperation group in the EU, where basically all the election

00:54:38.651 --> 00:54:44.117
- officials across the entirety of the EU get together regularly, share best practices, and they invest

00:54:44.117 --> 00:54:46.046
- a ton in kind of raising all boats.

00:54:46.146 --> 00:54:52.574
- And that's mirrored as well. They have something called the Digital Services Act. And that's the thing

00:54:52.574 --> 00:54:59.002
- that has really rigid requirements for the tech platforms. And if they breach them, up to 4% of global

00:54:59.002 --> 00:55:05.430
- total revenue is on the line. So they stand up and they take it seriously. So even in the war in Gaza,

00:55:05.430 --> 00:55:11.671
- that was mostly regulated by the EU through the Digital Services Act in terms of what was being fed

00:55:11.671 --> 00:55:15.166
- and disinformation and deepfakes online. Last question.

00:55:15.490 --> 00:55:23.191
- There's a long refrain that the US innovates while the EU regulates. And this is another example of that. So you

00:55:23.191 --> 00:55:30.966
- talked a bit about this, and I'm seeing it intensely in social media, that the amount of fake news

00:55:30.966 --> 00:55:38.891
- has kicked up tremendously even in the last few months. Like just, I'm seeing stories that are completely

00:55:38.891 --> 00:55:45.470
- fabricated and written in a way that looks authentic and it's tricking a lot of people.

00:55:45.570 --> 00:55:52.976
- on all political sides. And what I'm wondering about from the corporate side of things is, I am

00:55:52.976 --> 00:56:00.679
- under this suspicion that that financially benefits these social media companies and so they don't have

00:56:00.679 --> 00:56:08.382
- an interest to change it and fix it. But what I'm curious about is whether the technology is available to actually

00:56:08.962 --> 00:56:15.735
- limit, strictly limit that or even eliminate it? Like, does that exist at this point? And could they

00:56:15.735 --> 00:56:22.643
- implement it if they had the will or were forced to? Yeah, so unfortunately, we are headed in the

00:56:22.643 --> 00:56:29.551
- wrong direction at a lot of these platforms. So a lot of the trust and safety teams have been gutted,

00:56:29.551 --> 00:56:36.190
- especially at formerly Twitter, now X, but even at, you know, Google, even at Microsoft, right? So

00:56:36.418 --> 00:56:42.963
- That's a problematic starting point, because the teams that were there that had the institutional knowledge

00:56:42.963 --> 00:56:49.326
- and had the memory of how this was done, a lot of them are gone now. So that's the starting point, which

00:56:49.326 --> 00:56:55.387
- is a tough one. Can you do it is another really good question. So the utility and the technology of

00:56:55.387 --> 00:57:01.568
- how we spot AI deepfakes, it kind of varies per application. In short, we're good at pictures. We can

00:57:01.568 --> 00:57:06.174
- spot that pretty well. Not so good at sound, and we're really bad at video.

00:57:06.338 --> 00:57:11.995
- believe it or not, in terms of automatic detection. We can see it oftentimes with our naked eyes, better than

00:57:11.995 --> 00:57:17.490
- the actual computer vision, the systems that are monitoring for this stuff. So that means we can get

00:57:17.490 --> 00:57:23.256
- ahead of some of these challenges more easily than others. And there's some legislation around the world,

00:57:23.256 --> 00:57:27.934
- but there's also, on the norm building side, some efforts to require basically labels

00:57:28.162 --> 00:57:34.162
- kind of like we have for other things. You picture like Energy Star or something that tells you something

00:57:34.162 --> 00:57:40.275
- meaningful about the energy use or the privacy and security of a product. Do that for AI generated content,

00:57:40.275 --> 00:57:46.105
- right? And there's no reason the state couldn't do that, right? And potentially, you know, the federal

00:57:46.105 --> 00:57:51.765
- government could at some point come into that too. As you could guess, it's tricky for sure. And if

00:57:51.765 --> 00:57:56.350
- it happens, it's gonna happen in the EU first. That was too much doom and gloom.

00:57:58.210 --> 00:58:05.210
- Sorry. Thank you all. Scott, thank you for a very interesting presentation. You're a credit to our club.

00:58:05.210 --> 00:58:12.488
- In honor of your talk, a donation will be made this quarter to Amethyst House. I'd like to thank today's

00:58:12.488 --> 00:58:19.557
- volunteers: Diana Hoffman, Hank Walter, Alan Barker (wearing three hats), Joy Harder, Jeff Richardson,

00:58:19.557 --> 00:58:25.310
- Michael Shermas. Next regular meeting will be here in the Georgian Room next week.

00:58:25.890 --> 00:58:33.982
- Audrey McCluskey will speak to us about capturing joy, a childhood in Jim Crow America. Tyler, if you

00:58:33.982 --> 00:58:42.391
- would put up the graphic for the Four-Way Test, and please stand if you're able and join me. Of the things

00:58:42.391 --> 00:58:50.404
- we think, say, or do: first, is it the truth? Second, is it fair to all concerned? Third, will it build

00:58:50.404 --> 00:58:53.022
- goodwill and better friendships?

00:58:53.506 --> 00:58:58.302
- Fourth, will it be beneficial to all concerned? And fifth, is it fun?
