September 23, 2019 – Better Online Learning with Marisa Murgatroyd and AI Marketing with William Ammerman
“The audio file was removed when we switched hosts. Sorry. The cost was prohibitive. If you need the file, contact us and we will send it.”
Marisa Murgatroyd – CEO & Founder at Live Your Message with Marisa Murgatroyd – Be the Superhero to Your Tribe
The thing about making money while you sleep is that
before you can make money while you sleep, first you
have to make money while you are awake!
Marisa Murgatroyd started making movies when she was 17 and, in her early 20s, shot her first full-length documentary in the Brazilian Amazon. She then went on to produce dozens of web, video, design and interactive media projects for clients like the Getty Museum, PBS, UCLA, Room to Read, and the State of California. Cumulatively, these projects have moved over 70,000 units, generated over $800,000 in sales, and won 30+ industry awards. Today, she is the founder of Live Your Message and the Experience Product Masterclass, where she helps people who facilitate online programs get better and faster results for their students. She has always used media to understand and magnify her experience of the world. After winning multiple awards in creative writing and photography as a high school student, she went on to study the art and science of media production at Brown University and earned a Master’s in communication design at a prestigious London art school.
William Ammerman – Digital Marketing Executive, Executive Vice President of Digital Media at Engaged Media Inc. and Author of The Invisible Brand: Marketing in the Age of Automation, Big Data, and Machine Learning – Read interview highlights here
I came up with the term psychotechnology to explain that we are
now dealing with technology that changes us psychologically.
William Ammerman leads the digital transformation of Engaged Media and its premium publishing brands. He also publishes a daily blog focused on marketing in the age of artificial intelligence, available at www.marketingandai.com. He wrote The Invisible Brand to show how advances in artificial intelligence are affecting you through the world of marketing and advertising. His background as an advertising executive, coupled with post-graduate work in artificial intelligence, gives him a unique perspective on the subject of marketing in the age of AI.
Highlights from William’s Interview
Your first marketing P might be psychotechnology. I came up with that term, because I was trying to find a way to explain that we are now dealing with technology that changes us psychologically. When I created a contraction of psychological and technology, I came up with psychotechnology to describe the convergence of a lot of trends in marketing that I’d love to talk about, that really are changing the very nature of what we do, how we persuade, and how we change the way consumers think and what they buy.
Let’s look at the trends that are happening already. First of all, we have the personalization of information. We flip open a laptop to the New York Times, as an example, and I might see different ads than you would see based on who we are, what we’ve done, and where we’ve been. But more important, the process of getting information is being customized and tailored to our individual tastes. Your Facebook feed is different from my Facebook feed; the information that you seek out is being tailored to your tastes and interests based on previous behaviors. And increasingly, computers are being used to actually write the copy that you read. A company called Narrative Science uses natural language processing to generate headlines and news stories on the fly: box scores, baseball game summaries, stock quote summaries. So the first trend is that information is personalized, just for you, and that’s a big difference from the days of mass broadcasting. When I went to college, everybody got the same message off the same radio station, the same television tower, the same magazine, the same newspaper. Today, we’re all getting different information tailored to our needs. That’s trend number one.
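For the technically curious, here is a minimal, hypothetical sketch of the kind of template-driven copy generation William describes. It is not Narrative Science’s actual system; the field names, verbs, and data are invented purely for illustration:

```python
# A toy "robot journalism" generator: structured box-score data in,
# a one-sentence game recap out. Real systems use richer templates
# and natural language generation models; this only shows the idea.

def recap_game(box_score: dict) -> str:
    # Sort teams by runs scored, descending, to find winner and loser.
    winner, loser = sorted(box_score["teams"], key=lambda t: -t["runs"])
    margin = winner["runs"] - loser["runs"]
    verb = "edged" if margin <= 2 else "routed" if margin >= 6 else "beat"
    return (f"{winner['name']} {verb} {loser['name']} "
            f"{winner['runs']}-{loser['runs']} on {box_score['date']}.")

print(recap_game({
    "date": "September 23, 2019",
    "teams": [{"name": "Cubs", "runs": 8}, {"name": "Pirates", "runs": 2}],
}))
# -> "Cubs routed Pirates 8-2 on September 23, 2019."
```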
We are in an age where people choose the facts that they want to believe. I often jokingly say we like the lies that we want; we tend to go for the information that confirms our existing biases. And because the algorithms are designed to serve up what we want, we surround ourselves in a little bubble, an echo chamber of our own desires. So the truth that comes to us through the internet is the truth that we’re seeking. It’s the information that confirms what we want to believe. That puts us into very polarized little echo chambers of our own creation, filter bubbles, if you will. The algorithmic technology that’s delivering information to us is intentionally putting that information in front of us because we like it. If we don’t like it, we don’t consume it. So the algorithm avoids giving us information that we don’t like.
Without crossing the creepy line, a lot of this technology isn’t necessarily spying on you in the darkness. What it’s doing is making inferential assumptions about who you are based on other people who have similar signals in their data, where we do know their political affiliation. Political affiliation in the United States is a matter of voter registration; it’s public information whether you’re registered as a Republican or as a Democrat. So what we can do is take data at a very high level, look for patterns in the data of registered Democrats versus patterns in the data of registered Republicans, and try to find signals that are highly indicative of being a Democrat or being a Republican. Then, when we have somebody whose political affiliation we don’t know, we look for the signals in their data and try to make an educated guess about their affiliation based on what similar people are like. We call those lookalike audiences. That’s a very common tactic, not just for judging your political affiliation, but for trying to decide whether you’re about to buy a car, or whether you’re in the market for a new house, or whether you’re a college student. There are a lot of tools available for leveraging data to make assumptions about where you are at certain stages of a marketing customer decision journey. I’m not sure what signals exactly they were looking at, but I can tell you for certain that’s the methodology.
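In machine-learning terms, the lookalike approach William outlines is ordinary supervised classification: train on people whose label is known, score people whose label is not. Here is a simplified sketch of that idea; the behavioral features and numbers are made up for illustration, and real systems use far richer signals:

```python
# Lookalike-audience sketch: learn patterns from people with a known
# label (e.g. party registration), then estimate the label for someone
# whose affiliation is unknown from their behavioral signals.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Rows: people with known registration. Columns: hypothetical signals,
# e.g. visits to outlet A, visits to outlet B, outdoor-gear purchases.
X_known = np.array([
    [12, 0, 3],
    [1, 9, 0],
    [10, 1, 4],
    [0, 11, 1],
])
y_known = np.array([1, 0, 1, 0])  # 1 = registered with party R, 0 = party D

model = LogisticRegression().fit(X_known, y_known)

# Someone whose registration we don't know: guess from their lookalikes.
unknown = np.array([[8, 2, 2]])
print(model.predict_proba(unknown))  # e.g. high probability of party R
```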
What’s important about that is that when we’re trying to persuade you, we have a few different levers and knobs and buttons we can push to change the way you think. Persuasion has become not only something that is practiced at a personalized level, but something that is practiced scientifically. We have now reached a point where we are A/B testing messages to see which ones work on you. And we have a different message for Bill and a different message for your sister and your aunt; they’re all receiving different messages because the technology is able to change and optimize the messaging, optimize the images, optimize even the financial offers being presented, in order to persuade you about something. The second major trend is that we’re no longer just putting a wet finger in the wind trying to figure out which way the wind is blowing. We’re actually using science to persuade. The science of persuasion has grown as an academic field. People like BJ Fogg and Clifford Nass have made tremendous academic advancements in this field. And we’re now at an age where persuasion is a science. So that’s the second major trend contributing to the emergence of psychotechnology.
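The A/B testing William mentions has a simple statistical core: split an audience at random between two messages and check whether the difference in conversion rates is bigger than chance would explain. A minimal sketch, with invented numbers:

```python
# Two-proportion z-test for an A/B message test. |z| > 1.96 roughly
# corresponds to a 95% confidence that the variants really differ.
from math import sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Variant A: 120 conversions out of 5,000 impressions; B: 165 out of 5,000.
z = two_proportion_z(120, 5000, 165, 5000)
print(f"z = {z:.2f}")  # about 2.7, so variant B's lift is unlikely to be luck
```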
Number three: the ability for machines to learn. Machine learning is really new. We’ve used statistics for a very long time; you can plot information on a chart and make some predictive assumptions about what the future might hold, given a scatter plot of variables. But what’s really new is that now we’ve got computers that can not only go find the patterns in the data, but can actually prescriptively change the outcomes, and that’s very different. Instead of just predicting the future, we’re now prescribing it; we’re changing the outcomes by testing different messaging, by testing and optimizing advertising and marketing campaigns toward some key performance indicator. The ability for machines to learn how to persuade you using personalized information is genuinely new. Previously, a human being might sit in a focus group and test some messaging with six other people, and come out with some ideas about what you liked and didn’t like about an ad campaign. Now we’ve got Google, which is like a global focus group with 300 million people testing a message, testing an idea. So the ability for the machine to learn, to figure out how to persuade you using personalized information, is trend number three.
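The “prescriptive” part of this trend is often implemented with bandit-style optimization: rather than reporting results after a campaign ends, the system keeps shifting impressions toward whichever message is performing best on the chosen KPI. A hedged sketch of that loop, using simulated click-through rates rather than real user data:

```python
# Epsilon-greedy bandit over three message variants: mostly exploit the
# best-performing message so far, occasionally explore the others.
import random

true_ctr = {"message_A": 0.02, "message_B": 0.035, "message_C": 0.015}  # hidden from the learner
shows = {m: 0 for m in true_ctr}
clicks = {m: 0 for m in true_ctr}

def choose(epsilon=0.1):
    if random.random() < epsilon:                       # explore occasionally
        return random.choice(list(true_ctr))
    # exploit the message with the best observed click rate so far
    return max(shows, key=lambda m: clicks[m] / shows[m] if shows[m] else 0.0)

for _ in range(20_000):
    m = choose()
    shows[m] += 1
    clicks[m] += random.random() < true_ctr[m]          # simulated user response

print(shows)  # most impressions drift toward message_B, the true best performer
```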
The fourth thing, which ties them all together, is that we are now talking to our devices. We’re talking to our televisions, we’re talking to our cars, we’re talking to our mobile devices. The questions we’re asking today are fairly basic: directions to the nearest Thai restaurant, or a recommendation for a scary movie. Tomorrow, we’re going to be asking more sophisticated questions: what kind of car should I buy? What kind of health insurance should I own? What kind of job should I seek? Who should I date and who should I marry? And it’s a moment where we are opening ourselves up to that level of persuasion. We have to be a little careful to understand who it is that we’re speaking to. We have to see that there is somebody on the other end of that device, trying to change us and trying to persuade us, and that’s why I called the book The Invisible Brand: it’s using psychotechnology through our devices to change the way we think and what we do. Because we’re now talking to devices, we’re talking and being persuaded in natural language. And that opens us up to a whole new level of persuasion.
It goes far beyond the very obvious, active voice that’s telling you what to do. It’s the fact that we are now asking for recommendations, and when we ask for recommendations, the advice that comes back is there for a reason. There are brands, interests, institutions, and politicians that want to answer your questions in the way they want to frame them. And as we become more vulnerable to this level of persuasion, they’re going to want increasing influence over what those voices are and what those answers say.
I’ll give you a really interesting anecdote. There was a study done very recently where a group of participants was engaged in a conversation with a little robot. At the end of the conversation, the control group was told to turn the robot off. Of course, they just reached over and turned off the robot. In the study group, they were challenged by the robot, which said, “Please don’t turn me off. I’m scared of the dark.” Now, guess what? A frightening number of people in the study refused to actually turn the robot off, because they were very empathetic, they were relating to that robot. I was scared of the dark as a little kid; I don’t want that robot to be scared of the dark; I can relate. I’m empathetic to that experience. Well, the robot wasn’t experiencing being afraid of the dark, it was programmed to say that, but we as humans project our own empathy onto that device, and that makes us vulnerable to persuasion. The more we interact with our devices using speech, the more vulnerable we become, because speech is something very innate, something deep inside us. We don’t learn to read until we’re four, five, six years old, but we learn to speak when we’re one and a half years old, and that is deep inside who we are as human beings. It’s deep programming inside us. Tapping into that programming using language is an opportunity for marketers that I think has yet to be fully explored and understood. So I gave it a name. I said, okay, we’re now talking to machines that are designed to learn how to persuade us using personalized information. You and I need a word to describe that, to be able to talk about it. And that’s why I created the word psychotechnology.
I would like to say that if you’re into being scared of things, I’ll tell you something that you should be scared of. In China today, they are experimenting with a social credit system. The system gives people access to transportation and housing and food and other benefits based on their social credit score. Now, this is a form of currency, and the currency is controlled by the government, determining whether you are a worthy Chinese citizen. If you are a bad person in the eyes of the Communist Party, if you say the wrong things, if you get out of line, if you protest the government, your social credit score goes down. This becomes a tool for oppression at a massive level. And I’m telling you, if there’s anything to be scared of, that’s where you should be focusing your time and energy. We’ve got to think really hard about how to prevent that level of oppression. That’s serious. Now, I want to say I’m not here to scare everybody. I’m not here to turn everybody into Luddites walking around in tinfoil hats. What I really want to say is that there are opportunities and risks, and we have to educate ourselves so that people have a better understanding of the technology and what’s happening.
I’d like to share one positive opportunity before we get ourselves too worked up about the terrible things. Persuasion could do us a lot of good if it were designed to persuade us to live a healthier, better lifestyle. Healthcare is an opportunity where there really is true benefit in thinking that persuasive technology might help us break bad habits. If we’ve got a nicotine dependency, if we’re addicted to something, there are true benefits to being able to overcome those addictions. I gave an example in the book of an imaginary pair of shoes with sensors built in that detect when you have changed the way you walk, because the way we walk can be indicative of health changes. We see Parkinson’s patients and Alzheimer’s patients who change their gait: they walk with their hands limp at their sides, or they arch their backs and kind of shuffle their feet. If we can start to see those patterns emerging earlier, we might be able to mitigate the long-term damage of these diseases; we might make discoveries about how to, if not cure them, at least prevent some of the worst effects. Look at diseases like diabetes: they tend to attack the extremities, and your feet are very vulnerable. Maybe a pair of shoes with sensors could help determine that you are developing diabetes, and that there are things you should be doing to change your diet, your exercise routine, your lifestyle to prevent that disease from continuing its devastating course in your life. There are going to be tremendous advances as a result of being able to aggregate data and leverage persuasion to change habits, our worst instincts, and our worst behaviors. I think we can all agree that if we could live a healthier life, it would be worth wearing that pair of shoes. That would be a genuine benefit of this technology; there are going to be tremendous benefits. There are going to be benefits in education: we’re going to figure out how to teach you faster based on your learning style. We’re going to improve our healthcare system, and we’re going to make advances in the arts and sciences in areas where we haven’t begun to tap the full potential. There’s opportunity here that’s boundless, that’s tremendous.
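Since the shoes are imaginary, any code here is doubly hypothetical, but the underlying idea is a familiar one: establish a personal baseline from sensor readings and flag sustained deviations. A purely illustrative sketch, with invented thresholds and data, and in no way a diagnostic method:

```python
# Gait-change detector: compare each day's average stride length
# against a 30-day personal baseline and flag large deviations.
from statistics import mean, stdev

def flag_gait_change(daily_stride_cm, baseline_days=30, z_threshold=3.0):
    baseline = daily_stride_cm[:baseline_days]
    mu, sigma = mean(baseline), stdev(baseline)
    alerts = []
    for day, stride in enumerate(daily_stride_cm[baseline_days:], start=baseline_days):
        if sigma and abs(stride - mu) / sigma > z_threshold:
            alerts.append((day, stride))   # a day worth mentioning to a doctor
    return alerts

# 30 normal days around 72 cm, then a gradual shortening of stride.
history = [72 + (i % 3) for i in range(30)] + [68, 66, 64, 63]
print(flag_gait_change(history))  # flags the last four days
```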
Tell Alexa to buy The Invisible Brand by William Ammerman. Or you can go visit my website, which is wammerman.com. You’ll find out all about the book and where you can learn more about psychotechnology. I’d love for you to read The Invisible Brand. It’s a great book.