A History of Marketing / Episode 23
This week, I'm joined by Bill Moult, an entrepreneur and executive who spent his career at the center of the marketing data revolution. Bill has served as CEO and President of companies central to advertising research, new product forecasting, brand metrics, and media measurement.
Bill also served as President of the Marketing Science Institute, an influential non-profit that bridges the gap between marketing academics and practitioners to drive business performance.
Listen to the podcast: Spotify / Apple Podcasts / YouTube Podcasts
In this conversation, we explore Bill’s historic career and learn timeless lessons in leadership and marketing insights. We discuss:
The evolution of new product forecasting, from expensive regional test markets to the sophisticated "simulated test markets" pioneered by his company, BASES.
How scanner panel data and advanced modeling allowed companies to accurately forecast sales for new products before they even hit the shelves.
A high-stakes case study of a major CPG company that ignored a data-driven forecast and lost $100 million on a failed product launch.
The shift in advertising research from measuring single metrics like "recall" to integrated models that incorporate persuasion and predict in-market sales.
The role of the Marketing Science Institute (MSI) in connecting academic and business worlds, and how it served as an early platform for ideas like Clayton Christensen's "disruptive innovation" and David Aaker's "brand equity."
An inside look at how Nielsen actually works.
Note - I use AI to transcribe the audio of my conversations. I review the output but it’s possible there are errors I missed. Parts of this transcript have been edited for clarity.
Andrew Mitrak: Bill Moult, welcome to A History of Marketing.
Bill Moult: Well, thank you. I appreciate it, Andrew. I am delighted to be part of it. I think what you've done already is really ambitious, and if I can help in any way, that'd be great.
Andrew Mitrak: I'm sure you can. I'm so excited to speak with you. You've had this amazing career as a marketing research executive and an entrepreneur, and you've intersected with foundational organizations: Ipsos, which is one of the world's leading market research firms; Nielsen, which is a name everybody's familiar with, the top analytics firm; and the Marketing Science Institute, a non-profit that's been brought up on the show in previous episodes, that brings together leading marketing practitioners and academics into one forum.
I'm hoping to use this conversation to cover your career, the life and times of Bill Moult, and use your career in marketing to explore these institutions and the evolution of marketing research and the industry more broadly. Does that sound good?
Bill Moult: Fair enough.
The Influence of Philip Kotler and Early Marketing Science
Bill Moult: You and I were talking before this video started, but I'm going to come back to your first interview, which was with Philip Kotler. Which, I would say, Andrew, was a great choice. I'm fortunate to know Phil from my Marketing Science Institute days. The reason it was a great choice was that he had gone through a much more intense educational experience than I did: he studied economics under Milton Friedman at the University of Chicago, then went for his doctorate at MIT, where he studied under Samuelson. Both giants, obviously. Phil could have worked anywhere because he had smarts, drive, and charisma, and he certainly learned a heck of a lot about economics. But he chose to work in marketing. The reason he did that, in part, was because he recognized that there was some science in marketing.
Over time, there's been a lot more science developed in marketing and marketing science, in particular, as a part of the field. I worked in marketing science for almost my entire career, in different kinds of marketing science. It was during periods of time when those different parts of the industry were starting to develop. I never had an interest in a mature business. I was too much of a risk-taker, so I liked stuff that was either a startup or a turnaround. Fortunately for me, there were plenty of those opportunities that happened in the marketing science field.
I felt that my work, starting at Wharton, prepared me to take a scientific view of marketing, marketing research, and marketing analytics. That's pretty much lasted through much of my career.
Andrew Mitrak: I was very fortunate to speak to Phil first, both for his contributions to the field (his textbook was the first major marketing textbook I had) and for his impact on this podcast. Had he not responded so quickly and wanted to do an interview with me, who knows if this podcast would even be happening.
Bill Moult: Yeah, I am very, very grateful for Phil Kotler. Do you know what edition you had?
Andrew Mitrak: Fourteen.
Bill Moult: I had the second edition. When he wrote his first edition, I was in the Marine Corps. That was '67, so it wasn't too long after that he did the second edition. I am really humiliated that yours is much more recent.
BASES: From Booz Allen to a Leading Forecasting Firm
Andrew Mitrak: (Laughs) Back to your career and your journey. You went into the private sector, and the company you joined and became president of was called BASES. You led this research organization, joining right around 1980 and growing it throughout the 1980s. Can you share what BASES was and what drew you to the organization?
Bill Moult: I wouldn't say I built it, but I joined it very early. It's very important to say where it came from. There were three or four key executives at Booz Allen; BASES actually stands for Booz Allen Sales Estimation System. One of them was Jack Brown. One was Lynn Lin, a very, very talented statistician, mathematician, and econometrician. One was Regg Rhodes. They started BASES as a business and then asked me to join. I was there three months when they asked me to become the president. It was maybe my favorite organization in my career, and I was there about 10 years. I wasn't there to take it from zero to one or zero to two, but I managed it from $2 million to $20 million in my first six years. Then it grew again by a factor of 10 and more after that, and became the leading new product testing and forecasting company in the world. It was a real success story.
Bill Moult: We weren't the first one in the business. We had some very good competitors, but we managed to do a lot of things pretty well, and we relied very heavily on the data that was available, not just to our company, but to our sister company, AdTel, a split cable testing company, and others that had this amazing new scanner data and scanner panel data that was coming on the market. Lynn Lin really figured out how to make some magic out of the depth of that data, working with things like what was called a repeat decay curve, trial and repeat rates, purchase cycles, and all that kind of stuff. We ended up being able to forecast new product sales really well.
Navigating the New Product Landscape in Consumer Packaged Goods
Bill Moult: The new product business, particularly in consumer packaged goods (CPG), which is where some of our sophisticated clients were, went through a transition: from very big regional test markets, which were hugely expensive, to smaller regionals, then to large individual markets like Charleston, West Virginia, and Orlando, Florida, which AdTel did, and so forth. Then BehaviorScan, from Information Resources (IRI) in Chicago, came along and said, "Let's make them really small markets." They went to very, very small markets, about eight or 10 around the country, and figured out how to make testing effective even on a much smaller scale. Meanwhile, what BASES and our competitors worked on were called simulated test markets. They figured out how to do this without even a test market, using very deep survey research and very extensive analysis, and were able to forecast sales before the product was even ready to be in the marketplace.
That took off because it was economically so viable and offered such tremendous leverage. But the other reason it took off was that when people went to test market, something important happened. They were so committed to the thing as a company – these are clients like Procter & Gamble and Kraft Foods and so forth – that no matter what happened, they were going to go national. That was a really bad idea and it lost them a lot of money. The earlier you could get answers and predictions, the more likely the prediction was to actually drive a good decision.
Product Demand Forecasting Examples
Andrew Mitrak: Let's dive into a product example that BASES would work on for a Procter & Gamble or General Mills or one of those large CPG companies. Say you've met the client, they're introducing a new product, and they want to understand its market viability and forecast its sales. Then what? Is there a specific example that could illustrate the journey?
Bill Moult: Even with 40 years in between, I don't think the statute of limitations is that short for brand-specific stuff. I will tell you my favorite client was Procter & Gamble, and I can probably talk about some things that happened there without disclosing the brand. You would develop a concept or product, and they'd have to have enough supply of the product that they could actually put it into the hands of two or three hundred, four or five hundred people and have them use it in their homes for two weeks and then come back and respond. You have a chance to see how they respond, how they react to the advertising positioning. It could be a board, or it could be a video ad or whatever.
Andrew Mitrak: Just to ground this, let's say, for the sake of argument, it's a laundry detergent. Procter & Gamble or another large company is introducing a new laundry detergent and they want to see: do consumers buy this? Like it? Use it? And they come to BASES to help figure that out. Is that the right way to think about it?
Bill Moult: Yeah. If that were the product, they would usually have developed both a concept and a product. You say, okay, this is the way we're going to advertise it; this is a board that shows how we're going to advertise it; this is the positioning. It may have to do with cleaning, it may have to do with scent, it may have to do with sustainability, it could be anything. Is that an attractive enough idea that somebody's going to actually buy it? You go through the process of determining whether somebody's likely to buy it. Then, among a subset of the people most likely to buy the product, you actually give them the product. In many cases, it's just a white box with black printing on it; it doesn't have the final packaging. You give them a chance to use it in their home, typically for two weeks. In that case, they would go home and try it on their laundry. Then we would call them back, as agreed – they knew we were going to be calling them back – and ask them, "Well, okay, what do you think of this? Would you buy it if it were available?" and so forth.

The first part is about trying to figure out whether they would try it the very first time. The second part is about whether they would rebuy. The third part goes on and on through this thing called the repeat decay curve, plus the purchase cycle, purchase variability, package sizes, all that kind of stuff. You put it all in a model, you crank it out, and you give them an answer. These forecasts are based on empirical research that says when people say they will definitely buy something, a lot of them actually don't. After use, you again ask questions like "definitely would buy, probably would buy," and so forth. The same thing is true there, but the percentages are totally different. You learn how to do that by looking at literally hundreds of new product introductions and linking those databases.
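To make the mechanics concrete, here's a minimal sketch (in Python) of a trial/repeat volume forecast in the spirit of what Bill describes. Every number and name in it, from the calibration weights that discount stated intent to the decay rate and the plan inputs, is an illustrative assumption, not a value BASES or any vendor actually used.

```python
# A toy simulated-test-market forecast: calibrate stated purchase intent
# into trial and repeat probabilities, then roll repeat purchases forward
# along a "repeat decay curve." All constants are invented for illustration.

# Stated intent overstates behavior, and the discount differs before
# vs. after in-home use, so the two stages use different weights.
CONCEPT_WEIGHTS = {"definitely": 0.75, "probably": 0.25}    # pre-use (trial)
AFTER_USE_WEIGHTS = {"definitely": 0.60, "probably": 0.20}  # post-use (repeat)

def weighted_rate(responses, weights):
    """Turn a distribution of intent answers into a behavior probability."""
    return sum(weights.get(answer, 0.0) * share for answer, share in responses.items())

def year_one_volume(concept_intent, after_use_intent, households,
                    awareness, distribution, purchase_cycles, decay=0.80):
    """Year-one units = trial purchases + decaying repeat purchases."""
    trial_rate = weighted_rate(concept_intent, CONCEPT_WEIGHTS)
    triers = households * awareness * distribution * trial_rate

    repeat_units = 0.0
    buyers = triers * weighted_rate(after_use_intent, AFTER_USE_WEIGHTS)
    for _ in range(purchase_cycles):  # successive repurchase waves in year one
        repeat_units += buyers
        buyers *= decay               # the repeat decay curve
    return triers + repeat_units

units = year_one_volume(
    concept_intent={"definitely": 0.20, "probably": 0.40},
    after_use_intent={"definitely": 0.15, "probably": 0.35},
    households=100_000_000, awareness=0.50, distribution=0.70,
    purchase_cycles=5,
)
print(f"Forecast year-one units: {units:,.0f}")
```

A forecast like this can then be compared against the client's stated minimum business requirement, which is exactly the comparison at the center of the next story.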
How Ignoring Forecasting Data Cost a Client $100M
Bill Moult: I'll give you one, though I won't say specifically what the company or the product was. We did this product and we knew it was a very, very high-profile product and initiative within the company. I was presenting it myself. We go into this presentation of the outcome, and usually there might be five or eight people in the room. There were 40. You could already read in the press that the world knew this company was going to be introducing into this new sector, and it was a big deal. Once I saw 40 people in the audience, it kind of filled out the story.
One of the things that happens: you have to have the concept and how you're going to present it. You have to have the product so people can take it into their homes and use it. And you have to have a very, very good understanding of how the client is going to introduce the product: how much money they're going to spend, how they're going to advertise, how they're going to promote, and so forth. It's a detailed plan, something like 25 pages long. It's really hard for them to do, and it was an unpopular part of the process, but an essential one. They did that, and they laid the entire thing out.
The single most important thing in that marketing plan was one question we asked: "What is your minimum business requirement? In your first year of retail availability, how much in sales will it take for you to say, 'Okay, this is worth doing'?" We focused on that number a lot. We did this product, and the concept wasn't bad; it was pretty good. The product wasn't bad either; it was pretty good. But it couldn't deliver the minimum business requirement. I'm giving this presentation, and I'm reading the tea leaves in the room. They sort of appreciated the learning, but they didn't appreciate my conclusion at all.
I said, "It's not a bad." I said, "You got to understand, this is not a bad concept. It's not a bad product. You have no chance of achieving your objective." Then I said, "In fact, I'm begging you not to introduce this product in the way you're doing it." About two weeks later, there's an announcement in the press that this company, well-known, respected company, has decided that they're going to go into a test market with this product.
I was elated, because I knew if they went into a test market they might lose $2 million, but they wouldn't go national and lose $100 million. I was thrilled. Then a month later, they introduced nationally. They even admitted at the time that the test market was a head fake; they were trying to throw off their competition. They did go into the market and they did lose $100 million.
Andrew Mitrak: Wow. Gosh, it's a shame you can't tell me which company this is because it's an interesting story.
When Forecast Data Doesn’t Favor Your Product
Andrew Mitrak: As a forecaster, you sometimes have to tell clients what they don't want to hear, and it's up to them whether they listen to your advice. That was a company that didn't listen. Are there examples where, taking the laundry detergent example, the study says, "This doesn't clean the clothes well enough, go back to the drawing board"?
Or do they change the product? Do they change the advertising? Do they change something else? Do they decide to kill that one and do something different? What are the outcomes for a large company?
Bill Moult: You must have read stuff. All of the above, Andrew. All those different things happened.
The thing that was nice about BASES was that it was early enough and inexpensive enough that clients could iterate and go through the stage more than once. A lot of the most successful products introduced during the '80s and '90s had been BASES tested. We couldn't tell anyone they were BASES tested, but sometimes the clients did. They would even go to retailers and say, "You should give us some credit for the fact that we've tested this thing carefully. Here are our BASES results, so we know this is going to be a successful product. Therefore, you want to put it on the shelves of your retail outlets."
The Rise of the Product Forecasting Industry
Andrew Mitrak: Was BASES a new category of forecasting company because of new methodology or new technology? Or did companies like BASES exist before? What did companies like Procter & Gamble use before BASES and what did BASES do differently? Can you frame BASES in a historical sense of where it fit in the evolution of forecasting?
Bill Moult: I love that question, because BASES was not the first. The first one was Yankelovich LTM. The second, and what became the most successful before we entered the business, was Assessor, from MDS, a company in Cambridge, Massachusetts associated with MIT, with really smart people. Then there was ESP, the model that came out of NPD. NPD was one of the first and, for many years, one of the leading companies in the panel data business. Panel data looks at purchases not by store, but by household, and how those purchases happen over time. We were actually the fourth one to enter the business, but we were committed to using the best available data, and the scanner data and scanner panel data that was coming along was amazing. There weren't that many people who had it or knew how to use it effectively, and no one else had Lynn Lin, who built these very sophisticated simulations. So we were maybe the fourth to enter the business, but we probably had and used the best available data. We were also in the AdTel business, the split cable business: the person in your house and the person in the house next door might actually receive different advertising messages through the split cable, or more advertising or less, and you could see the responses they would have over time. That was the electronic test marketing business, which AdTel and BehaviorScan did. They had just fantastic data.
Understanding Scanner Panel Data
Andrew Mitrak: You mentioned the source for some of this data was scanner panel data. Scanner panels are something I've heard brought up on this show a couple of times, and I've visited the Wikipedia page a bit for it. But for listeners who might be hearing of a scanner panel for the first time, at a very nuts and bolts level, what exactly was a scanner panel? How did that data get collected?
Bill Moult: Okay, first of all, the scanner refers to UPC scanners at the supermarket. Prior to that, you had a sales clerk checking things out, and it was very, very labor intensive. Somebody figured out how to develop the little UPC code, so an item could be scanned in a second, rather than taking half a minute to get the complete identification. The industry really got into it because it worked so well for retailers, and if it works well for retailers, you have to figure out a way to make it work for the people who sell to retailers. It used to be audit data or warehouse withdrawal data, but scanner data allowed you to get very, very detailed data, not only for a market, but for a store.
Taking it beyond the store, you could recruit people to participate in panels. They might present a card when they check out and get a little discount for doing so, but the real purpose of the card is identification: regardless of where that person shops across the many stores and chains in their market, you know what they buy. You know if they bought something once, twice, or three times, how often, and in what quantities. That's what a scanner panel is: a group of thousands or tens of thousands of families whose purchase information is directly linked to the stores and to the individual products. You start with a scanner; it becomes a panel when people are recruited to participate. They might participate just by presenting a card, or in some initiatives they actually write things down.
Then there's another level beyond that which I should mention. It's great if you get data with all their purchase information, but what about their exposure to advertising? That's an entirely different set of data. If you can get that and incorporate it, it's called single source data. That's unbelievably powerful, because then you have a chance to see: Andrew bought this, he was exposed to this advertising, his neighbor wasn't exposed to it, and this is the behavior that followed. How can I quantify it over time? Then I can predict the sales as well as understand what's really working and what's not. Those are the different kinds of data, and those are the reasons the models were changing so fast: the data was getting so good, and it was getting really, really complete. Single source data went further later in my career, when I wasn't as involved. A woman by the name of Leslie Wood did the best analysis on that. I'm jumping way ahead, because that happened later in my own experience, but single source data ended up being, and continues to be, a really powerful tool, particularly for consumer packaged goods.
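As a rough illustration of why that household-level link matters, the sketch below joins hypothetical panel purchase records to hypothetical ad-exposure records and compares buying between exposed and unexposed households. Column names and figures are invented; real single-source datasets are far larger and messier.

```python
# Single-source data in miniature: purchases and ad exposures share a
# household key, so exposed vs. unexposed households can be compared
# directly. All data here is invented for illustration.
import pandas as pd

# One row per purchase occasion, keyed by household.
purchases = pd.DataFrame({
    "household_id": [1, 1, 2, 3, 3, 3, 5],
    "brand":        ["X", "X", "Y", "X", "X", "X", "Y"],
    "units":        [1, 2, 1, 1, 1, 2, 1],
})

# One row per household with its measured exposure (e.g., via split cable).
exposure = pd.DataFrame({
    "household_id":   [1, 2, 3, 4, 5],
    "saw_brand_x_ad": [True, False, True, False, False],
})

# The "single source" step: link the two datasets on the household key.
brand_x_units = (purchases[purchases["brand"] == "X"]
                 .groupby("household_id")["units"].sum())
merged = exposure.set_index("household_id").join(brand_x_units).fillna(0)

# Average brand X volume among exposed vs. unexposed households.
print(merged.groupby("saw_brand_x_ad")["units"].mean())
```

In a split cable test, the exposure column would come from which ad feed a household's cable drop received, which is what made AdTel- and BehaviorScan-style data so powerful.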
The Convergence of Data and the Rise of ASI
Andrew Mitrak: This intersection of scanner data, panels, and single source data was all coming together around the 1980s, driving innovation in forecasting accuracy with models like those BASES was pioneering. From BASES in the '80s, you went to ASI, an advertising research company where you became president and CEO. Can you tell me about that move, about ASI, and what advertising research looked like at the time?
Bill Moult: Every job I did, Andrew, was either a startup or a turnaround. This one was a turnaround.
When I joined, the people at ASI didn't know they needed a turnaround, but you could see it from the outside. The advertising testing business had actually started with a company called Burke, which was very visibly sponsored by Procter & Gamble. It measured what they called related recall. We ask you to watch a television program, ideally with the ad inside the program, and you agree that you will talk to us a day later. Burke would call you a day later and ask you about the program, though they weren't really interested in your answers to that. Then they would ask about the advertising you remembered from the program. What they really wanted was related recall: does somebody remember an ad, and can they correctly associate it with the brand that was actually being advertised? It's a very simple measure, and a very important one. Procter & Gamble was pretty visible in supporting it, and I think that's the most important reason it became pretty much a standard requirement for many advertisers. So Burke really used one measure: related recall.
Then another company came along, called either ARS or RSW (they actually had several different names). They also had one measure, but a different one, called persuasion. They said what's really important is: am I going to change what you buy based on your seeing that ad? They tried to relate exposure to the ad to brand choice: not whether you remembered the ad, but whether you changed what you bought. It's a good measure, but it was one measure. It was obvious to me shortly after I joined ASI, if not before, that ASI had very good people, but they were losing momentum, because this other company had come up with a better measure. Shortly after I joined, it became clear we had to change. We had one company with one measure and another company with another measure, both good, but both incomplete. I said, we need an integrated measurement of more than one dimension: some measure of recall and some measure of persuasion, put together, ideally with some analytics. In order to do that, I had to walk away from the single measure system, which was very profitable and had a long history. It was probably the biggest risk I took in my career for the company. We did take that risk.
This one I'll tell as a Procter & Gamble story, because they're so important. We not only changed the system to have multiple measures; I made the decision that we would wait until we had an unbeatable combination of business proposition, analytics, and everything else before we went back to Procter & Gamble. ASI had some of the early Burke people on the team at that point, so they were familiar, but we waited five years. After five years we went back and said, "We think we're ready to show you something that's better not only than anything we've done before, but anything anyone else has done before." It incorporated these multiple measures and linked them to actual sales response based on market mix modeling and a few other things. The same thing happened there that happens in all cases: when the real thought leader client in the industry changes, people tend to follow. ASI managed to grow very dramatically after having gone through this difficult period.

Then I went to the owners of ASI, Terry Elkes and Ken Gorman. These guys were great. Terry had been the CEO of Viacom before Sumner Redstone bought Viacom and ran it himself. I loved these guys, they were really good to me, and they were pleased to see what was happening with ASI. I said, "I need a meeting with you guys sometime soon." I met with the two of them and said, "Terry, Ken, I've loved working with you. But I've got to tell you, I've come to the conclusion that to fill myself out as a business person, I really need some international experience. I know ASI can't provide that; we don't have the resources, the wherewithal, to go international. For that reason, I'm giving you 18 months' notice." Ken or Terry said to me, "Bill, thank you, we hear you. Give us 24 hours to respond." We met again 24 hours later; I had no idea whether they were just going to kick me out or what. Terry said, "Bill, we have decided that we would rather sell the company than replace our chief executive. You can decide who to sell the company to." I was blown away that they would say that. Fortunately, there were four companies in the industry who, once they knew we were available, were very interested, and three of them were very willing to let this crazy president, or CEO at that point, have a tour in an international company. The one that ended up buying ASI was Ipsos, which is based in Paris. Not a bad choice. We sold it to Ipsos and I got a chance to spend some time in Paris.
Measuring Recall, Persuasion, and Response
Andrew Mitrak: There's a lot you covered there, and I want to go back to what you were saying about the beginning of ASI: they had been measuring one vector, which was recall, meaning how well an ad could be remembered at all. Important, but potentially insufficient. Then a separate competitor emerged measuring persuasion, which is more about the ad's effectiveness: you have to remember an ad, and it has to persuade you, for you to make a purchase. So this was about building a more multi-dimensional model for ASI's product offering. Is that the right way to think about it, that you took those two and brought them together? Were there any other dimensions involved?
Bill Moult: Those two, but more importantly, linking them directly to response. I knew the people with the data, and I knew the people who had the modeling. We brought in some fantastic modelers, and we were able not only to use these two measures, but to combine them in a way that was empirically derived and related to actual sales effect. No one had done it before. That was a pretty good combination.
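In spirit, that combination amounts to calibrating the two copy-test scores against observed sales response. The sketch below fits a simple linear model on hypothetical historical ad tests and uses it to score a new ad; the data, the linear form, and every number are assumptions for illustration, not ASI's actual model.

```python
# Combine recall and persuasion into one sales-calibrated score by
# regressing observed in-market sales lift (e.g., from market mix
# modeling) on historical copy-test results. All data is invented.
import numpy as np

recall     = np.array([0.22, 0.35, 0.18, 0.41, 0.30, 0.27])    # related recall
persuasion = np.array([0.05, 0.09, 0.11, 0.08, 0.03, 0.12])    # brand-choice shift
sales_lift = np.array([0.010, 0.024, 0.019, 0.027, 0.008, 0.025])

# Fit lift ~ b0 + b1*recall + b2*persuasion by ordinary least squares.
X = np.column_stack([np.ones_like(recall), recall, persuasion])
(b0, b1, b2), *_ = np.linalg.lstsq(X, sales_lift, rcond=None)

# Score a new ad on the combined, sales-linked scale.
new_recall, new_persuasion = 0.33, 0.10
predicted_lift = b0 + b1 * new_recall + b2 * new_persuasion
print(f"Predicted sales lift: {predicted_lift:.2%}")
```

The point of the exercise is the calibration step: neither raw score alone tells an advertiser what an ad is worth in sales terms.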
Andrew Mitrak: At ASI and advertising research companies more broadly, who were your main customers at this point? Were you primarily working for a CPG company like you were with BASES? Were you working with advertising agencies? Within customers, who was your key stakeholder? Was it a marketing department? Was it somebody who's overseeing advertising? Was it finance? What did that relationship look like or who was the primary connection point?
Bill Moult: It varied a little bit, but the lead clients for those sorts of things were the consumer packaged goods companies. The reason is that's the industry with the best data; they know how to get the most out of it, and it requires these models, these capabilities, and these talented people. The companies involved at ASI, and I'd say this was also true for our biggest competitors, were primarily consumer packaged goods companies. The part of the company we worked with was either the research department (in some cases renamed insights) or the marketing part of the company. The ad agencies, which were independent, would usually be involved, but they came kicking and screaming. No one likes to have their papers graded. Many of them were resistant at first, until they realized it actually helped the entire system, and some of them became really good users of research and, in fact, good innovators of research. Both ASI and BASES were overwhelmingly consumer packaged goods.
Consolidation in the Market Research Industry
Andrew Mitrak: BASES was acquired by Nielsen, ASI was acquired by Ipsos, and some of the other companies you've mentioned in this industry have also been rolled up or merged or acquired. There seems to be a gravitational pull toward consolidation in this type of industry. Why is that? Why is it not more distributed across a lot of small companies?
Bill Moult: I think both for data reasons and for financial reasons. I'll talk about Nielsen a little later, because it makes more sense in the context of the companies I worked for before Nielsen itself. But yes, there's been tremendous consolidation of the research industry, no question about it. I think it's generally been good, but it can go too far. In some cases you've seen companies consolidate and then split up again. Nielsen's done that about three times.
The Marketing Science Institute: Bridging Academia and Practice
Andrew Mitrak: This leads us to the Marketing Science Institute, which you joined after ASI. Can you share about what the Marketing Science Institute is? It seems like a different type of organization than you've previously worked in. What attracted you to it and what was the organization doing?
Bill Moult: Oh, it's definitely different. Different than any other I've seen. I was first exposed to MSI when I was a doctoral student at Harvard Business School, because the Marketing Science Institute actually started at Wharton and then moved to Harvard. While I was at Harvard, they made what I think was the most important decision: to make it independent, so it wasn't associated with any one school but could work with any school. I was there to see parts of that evolution. A guy by the name of Stephen Greyser was the longest-serving executive director. It's an organization that deals with both business people and academics in the field of marketing. It turns out that those two groups are generally not all that interested in each other. Most people in business don't really care much about academia, except their own alma mater. Most people in academia, even in marketing, have relatively little interest in real-life business. But if you get together the few people who really are interested, that subset can do amazing stuff.
Let me use this as a chance to jump into PIMS, which stands for Profit Impact of Market Strategy. It was a very, very ambitious project that started at General Electric (GE). There was a point when General Electric had dozens, maybe hundreds, of different businesses; they called them strategic business units, and they were a great consolidator, to use that term. Being in all these businesses, they said, "We've got to figure out what really makes a business successful and profitable. What is the profit impact of the things we do, including the marketing strategy?" That's what PIMS stands for, but it went way beyond marketing.
They did this analysis and got all the different business units to fill out ridiculously long questionnaires about every detail in the business so they could do a large, sophisticated analysis.
It was a really cool idea, but then somebody finally realized, "Wait a second, we're only looking at General Electric's businesses. How are we going to know what really matters if we're not looking at any other businesses? We're not even looking at our competitors." They said, "We believe in this project. We think we can learn from it, benefit from it, and change our business as a result of it, but we have to include a lot of different strategic business units, and we need to do it independently." They chose MSI as the independent middleman for the process. Bob Buzzell ran it. That was the first time, for example, that a sophisticated analysis said, "Yes, there's really a good reason for you to pursue a high market share rather than a low one, because there's a very strong financial link." You only know that if you build these extensive, sophisticated, broad-based analyses. That's what PIMS did.
What the Marketing Science Institute did more generally was to get those marketing people, the marketing companies, and the academics together doing research that made sense. The business people set the agenda. They said, "You guys are doing all kinds of research that we think is wasted. At the same time, we really need certain things done. For example, we need to figure out what the World Wide Web is going to be all about. Can you focus on that?" The academics would then spend a lot of time, a lot of resources, and a lot of smarts answering those questions. Usually it would take them years, but they'd report their answers back through the Marketing Science Institute. There was some really good stuff done.
The interesting thing was that there were about 70 member companies involved in the Marketing Science Institute, all blue-chip companies, and at least as many leading academics. They would get to this stuff first.
One of the best examples was disruptive innovation. You know Clay Christensen. He wrote a book that said, "This is the way you need to think about innovation, and disruptive innovation in particular." It was very successful, a bestseller. Two years before he wrote that book, he wrote a Harvard Business Review article that basically said the same thing. And two years before he wrote the article, he presented it at the Marketing Science Institute. People were getting involved very early on in developing areas. It didn't always happen quite that beautifully, but that was a good example of when it did. The Marketing Science Institute was a wonderful experience, and it was unique because it brought the academic world and the business world together. That's hard to do. It's currently part of the ARF (Advertising Research Foundation), but it's still largely independent.
Andrew Mitrak: That Clay Christensen example of disruptive innovation having its early beginnings at MSI is a great one. Others have come up on this podcast. I interviewed David Aaker, who really popularized the concept of brand equity and brought it to the masses; it defined a big part of his career. On this podcast, he talked about how he first learned about brand equity at the 1988 MSI conference. You have these conferences that set a theme and bring major thinkers in marketing together, with presentations, published papers, and an exchange of ideas, and somebody can latch onto an idea and carry it out of MSI to marketing and business more broadly and actually have an impact.
Bill Moult: That's a great example. One other thing that happened at MSI I've got to mention. The way it's organized, it's led by two people as peers: a president, who's hopefully around for years, and an executive director, who is an academic, and we tried to serve both groups. I was the president for a number of years, and there were two different executive directors while I was there. After we'd been working together for a while, one of them said, "Bill, we've got really smart people from the advertisers, the member companies. But they're all from market research, or sometimes from market strategy, and they're really not the people doing the marketing for the company. Whereas on the academic side, we have all the right academics, the science-oriented, quantitative ones. How do we fill the gap?" We identified an opportunity to make a connection with chief marketing officers, which was a relatively new role at the time.
I learned on your podcast that Sergio was the first CMO. I can say just "Sergio" because he only needs a first name; he's like Madonna, the Madonna of the marketing world. He was the first one, I think you said, in '93. A little less than 10 years later, we were able to invite between 40 and 45 of the best CMOs, mostly in the United States but a few international ones, and once invited, almost all of them came. We got them together with some academics. It didn't last forever, it went a few years, but it was really good because it got the marketing people, not just the marketing research and marketing analytics people, connected with the academics.
The Evergreen Challenge: Connecting Marketing Academia and Business
Andrew Mitrak: This intersection between academia and business leaders – I've interviewed a lot of academics on this podcast, and some business leaders too – feels like an evergreen challenge: getting them to speak, connect, and communicate. Connecting academia to the real world just doesn't seem to happen as much within the field of marketing. In other sciences, it does. In biology, a published breakthrough paper can spin up a new biomedical or pharma company and really be productized. Similarly, in computer science, a white paper on AI can spark a whole revolution in a field. That just doesn't happen as often in marketing. Why do you think that is? Why did we need institutes like MSI to bring these people together proactively, because it wasn't happening naturally?
Bill Moult: I don't know. I share your frustration that it doesn't happen more often. Think of some of the things that happened in Silicon Valley, where Stanford academics, especially, were very closely involved. It just hasn't happened in that way in marketing or marketing research. One example is a guy by the name of Michael Ray. Do you know him, or know him by reputation? He was a marketing professor who focused on advertising, but he spent much of his career at Stanford, and he ended up being a guru in the area of innovation. For whatever reason, he was interacting with a lot of the right companies in Silicon Valley, and they were a heck of a lot more interested in innovation than they were in advertising. He basically changed his focus as an academic. I think there was a different environment there. It's frustrating, but I also think it represents an opportunity. Some people make it work really well.
Nielsen: What Do They Actually Do?
Andrew Mitrak: I'm going to jump ahead a little and talk about Nielsen, because you were the president of Nielsen Media Analytics and Knowledge Collaborations. You were also the CEO of a company called Interscope Research, which you later sold to Nielsen, and you've had advisory roles within Nielsen. Even before I got into marketing, you'd hear about Nielsen and Nielsen ratings. A lot of people have heard of Nielsen, but it has always been a bit of a mystery to me what they actually do at the end of the day, how they actually work. Can you help demystify Nielsen and speak to your time there? What do they actually do, and what was your role within Nielsen?
Bill Moult: That's a tall order, Andrew. I was hoping you'd demystify it for me. Nielsen is an amazing company, or companies. It was really started by two guys who were phenomenal in terms of their impact: AC Nielsen and then AC Nielsen Jr.
Among other things, they created the concept of market share. They said, "All these companies are doing their own thing, selling their product. They think they're doing well, but they don't know if they're doing well." Companies really needed to understand what else was selling: not just what their own company sells, but everyone who competes with them. You're better off if you can gather data that measures the sales of every company in a category, compares them, and tells you not just what you're doing in terms of sales, but what you're doing in terms of market share. Market share turns out to be a very strong concept. This was way before scanner data, so they started with audits, and they succeeded in that business to the point that everybody felt the data was required to run their business.
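The arithmetic of market share is trivial; the hard part Nielsen solved was measuring every competitor in the category. A toy illustration with invented figures:

```python
# Share is your sales divided by total measured category sales, which
# requires data on every competitor, not just your own brand.
category_sales_mm = {"Brand A": 120.0, "Brand B": 80.0, "Brand C": 50.0}

total = sum(category_sales_mm.values())
for brand, sales in category_sales_mm.items():
    print(f"{brand}: ${sales:.0f}MM sales, {sales / total:.1%} share")
```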
Then they found that the same opportunity existed in media. It wasn't enough to know something about your own viewership; you needed to understand competitive viewership, or listenership, or whatever it was, to do the same thing. So they started Nielsen Media Research. Those companies have been together or separate multiple times in their history. They come up with good reasons why they should be together, then for a while it doesn't work and they split up. It's separate now; when I was there, it was together. They're interesting partly because of the capabilities they have, but I'll tell you, one of the things that's underestimated is how good the people were.
I'm going to use two or three examples here. One is a guy whose name you'll recognize, but for the wrong reason: Dave Calhoun. Dave has been very visible for the last three or four years because, after he left Nielsen and then Blackstone, he was on the board of Boeing when Boeing crashed two 737s. They asked him to get off the board and run the company, become the CEO, with the idea that all the problems Boeing had would go away. Well, they didn't go away. Boeing has had a very, very difficult time, and in my view, Dave got too much of the share of the blame for what could really be explained by a lot of things and a lot of people. He was the best CEO I've ever encountered in my career.
Dave was CEO when I was at Nielsen, and that was one of the periods when the businesses were all together and creating real value from being together. I'll give you another example of the quality of the people who were there, in what they did before, during, and after their time at Nielsen.
One of the best examples is Steve Hasker. Steve had been a very successful consultant at McKinsey. Dave Calhoun knew Steve; I think they had worked together on some things. Dave said to Steve, "You know, I was very fortunate that I learned a lot from Jack Welch when I was at General Electric, running two or three different businesses there. Steve, you've been a consultant at McKinsey, helping people understand what it takes to run a company well. Would you actually like to do it? Because I think I could teach you as much as Jack Welch taught me." You have to jump at an opportunity like that. Steve did, and he did really well there. Now he's the CEO of Thomson Reuters. There was a lot of quality in the people, but they also had a lot of opportunities and challenges, partly because they were in these different businesses, both of which were changing, with very, very different cultures.
The media business, the advertising business, and the marketing business are different animals, as were some of the people involved. The quality of the people, and the resources they had to link these databases, made it a really interesting but also challenging business to be involved in. I was fortunate, I think, to be involved.
I came in to start a business within these two businesses they call watch and buy. Watch is the TV or internet side, and buy is the UPC scanner data side. They're two different businesses, sometimes separate, sometimes together. One of them, the buy business, with UPC scanner data, had built up a lot of good analytical services over time. The other had built none.
The media business was saying, "This is how many people are watching your show," and that was pretty much it. I'm obviously exaggerating, but there was a total disconnect between the level of analysis and support on the buy business versus the watch business. They asked if I'd come in and build a business on the watch side. I said, "Yeah, I'll try." I found three relatively small operations running on a very custom basis. We pulled those together, built some other capabilities, and built some standardized services. Some of them did pretty well. I was only there about three years, and they've gone through one or more of these combinations and separations since, so I can't tell you exactly what its status is now. But as one more example of the quality of the people, I had the pleasure and the honor of starting the media analytics business for Nielsen.
My successor in that role as president of Nielsen Media Analytics was Karthik Rao. He's the CEO of Nielsen Media today. I was fortunate in the quality of the people I got to interact with and the things we did there, and I have very good feelings about a company that's been complicated and, as you said, a bit of a mystery to people off and on for many decades.
The Importance of People Above All Else
Andrew Mitrak: Something you've called back to a lot is that even as the technology changes at these companies, and the latest and greatest in marketing research changes, having great people as leaders stays constant in what makes a great marketing company. Thinking across the last several decades of your career and the period we've covered together, there's been so much change – we haven't even talked much about the internet – across the marketing, advertising, and research landscape. I'm curious what hasn't changed. What's still true today that was true when you started your career?
Bill Moult: Well, the most important thing in my career is the people I dealt with, and I was very fortunate there. What hasn't changed is that this is a business that will succeed or fail based on the quality of the people, the level of their commitment, and their ability to work as a team. I think that remains true. The data's changed. A mathematician might say things haven't changed at all, that you're really just crunching numbers in a different way and the underlying things are the same. But if you're looking for a constant, I'd choose the team every time.
Andrew Mitrak: I think that's a great message to wrap up on. Bill, it's been such a pleasure speaking with you. I really enjoyed it. Thanks so much.
Bill Moult: As did I. Thank you so much. You take care. Good luck with this entire initiative.