INTERVIEW

Beyond the Hype: The Real Impact of AI in Business

With Bob Cooper – ISBM Distinguished Research Fellow at Pennsylvania State University


It’s worth noting that Bob is no newcomer to AI – his new product development model was augmented and validated using machine learning and neural network analysis over a decade ago.

Bob Cooper is an esteemed research fellow, professor emeritus, and Crawford Fellow with an impressive background. His focus has been on new product development, but what caught our eye were his recent peer-reviewed articles examining the impact of artificial intelligence on that process.

Unlike the unsubstantiated claims from many vendors, Bob’s research provides a treasure trove of insights. Over the past two years, he has published seven articles in peer-reviewed journals, exploring how firms in the US and Germany are adopting AI for new product development and the resulting impact on performance.

This kind of rigorous, research-backed perspective is invaluable. Many of the real-world successes of AI are happening not in flashy consumer applications, but in more industrial and back-office settings – step-function improvements that may not make headlines, but are quite measurable. And Bob’s work highlights a crucial point – homegrown AI initiatives often struggle, while off-the-shelf solutions tend to be more effective. The moral may be to be wary of those selling tools for your own “gold mining” AI expedition, when leveraging existing applications may be a far better use of resources.

Richard Owen
Well, Bob, thanks for joining us today. We really appreciate you taking the time to join the CX Iconoclast podcast. Thank you.

Bob Cooper
Great to be here with you, Richard.

Richard Owen
So you just shared a recent article that you wrote, whose title is pretty self-explanatory: ‘Why AI Projects Fail – Lessons from New Product Development’. And a commonly quoted stat is that a very high percentage of AI projects, maybe 80%, fail. Perhaps we could just start with the highlights of that. If you were to summarize your key points from that article, what would you tell people?

Bob Cooper
Okay, Richard. The article’s available online from the publisher, in the IEEE magazine Engineering Management Review, and elsewhere. The key point is that when we did a complete search on this – not blogs, but the real research, the hard facts – we found about seven main reasons, and the top three are almost tied. They’re no surprise. Number one: the solution didn’t work so well. It didn’t work as expected. Well, hello. A lot of people are buying software from vendors and being promised a lot of things by the salespeople, and guess what, sometimes they don’t deliver. Either that, or there are other technical problems and the product simply does not work. So there are a lot of expectations being set by the marketing side of software companies – if you’ve gone online, I’m sure you’ve noticed that.

There are also technical reasons for the product not performing. But that was one of the top ones. Another one was data quality. And that’s data both for training and testing the model – in other words, setting it up, especially if you’re using a machine-learning type product that requires training – and also bad data once it’s in production or operation. Garbage in, garbage out.

And I guess one of the implications is that you need a real professional data scientist who understands data sources, validity, reliability, merging data and so on, and can make sure it’s clean and reliable. The third reason in the top three is that they didn’t understand user needs. In other words, they designed the product without really understanding what people wanted, or they bought a product without understanding what people wanted. So those are the top three. There are a few others down the list, and we can get into them if time permits.

Richard Owen
So let me make a perhaps controversial statement. It might not be controversial for you; you might be in violent agreement. In every technology wave, certainly every one I’ve been part of, the first vendors out of the gate are the people who sell the picks and shovels. They essentially say, here are the toolkits required. We saw this with RDBMSs in the 1980s, and we saw it with web app development in the late 90s and early 2000s. They come out of the gate with a technology set and sell it to corporations, and the message is: you can do it. You can find gold in the hills. Here’s the pick, here’s the shovel, here’s a pair of jeans – go looking for it. And there’s a lot of money made selling these kits to companies. But when we look back on these technology waves, those are rarely the applications that ever make it to full production. They’re usually the ones that fail. Is that something you agree with? Is that a fair observation?

Bob Cooper
I think that’s a reasonable observation. I’ve certainly observed that myself. Now, I don’t have scientific fact to back that up, but as a casual observation, yeah. I can remember when laptops came out, going into the shop with my wife, and they were saying, well, these are fantastic for keeping recipes. Okay – that was about the only application they could understand. Well, that didn’t sell too well, obviously.

But you’re quite right. The problem is that this wave is like many waves: it’s going up, then it’s going to come down a little bit, and then it’s going to swing back up heavily. So we’re sort of in a pre-wave of the main wave. And the pre-wave is a period of great speculation, of great enthusiasm, of great salesmanship.

And a lot of companies are making money out of this – obviously the tech companies, the suppliers. In fact, there’s a phrase going around that the only people making money out of this are the suppliers, not the users. So we’re sort of in that period of euphoria right now.

Richard Owen
Well, that would seem to be true. I mean, if you look at the stack, Nvidia certainly benefited enormously from the growth, and you could argue that AWS and Azure have done really well. But at the end of the day, the people who are supposed to be deploying it don’t yet seem to have seen the productivity gains. There was actually a piece published by Goldman Sachs, largely looking at this from a stock market perspective, asking the question: if companies ultimately don’t get enormous benefits, then there’s a problem, because all the capital investment in the technology isn’t going to yield much if businesses can’t get value.



Bob Cooper
Well, you know, you make a very interesting point, Richard. Some companies are doing darn well. For example, I’ve investigated companies like Nestlé in Switzerland, out of Lausanne. According to their CTO – and I read a speech he gave at an internal meeting recently – there has been a 60% increase in the pace of innovation at Nestlé as a result of using AI. Everything from idea generation, concept generation, concept testing, and doing the scientific research on the liquids or the food products, to mining their technical database. I mean, Vevey is their headquarters and also their technical center – it’s like a university there, if you’ve ever been on their campus. An enormous amount of knowledge that they’re mining. GE in the United States claims it can cut the design time for turbine blades, one of the trickiest parts of a jet turbine, by 50%. Renault just designed a new automated manual transmission – that sounds like an oxymoron, but it’s an automatic transmission that feels like a sports car, if you can believe it – using AI, and did it in about a third of the time, using Siemens software out of Germany.

So some smart companies are getting it right, and really getting dramatic improvements. And these typically are the more clever companies – the ones with deeper pockets, good IT departments, and a lot of good experience in program management, change management programs, et cetera. The average company is shooting itself in the foot. And that’s sad.

That’s really sad, because this obviously has to become mainstream.

Richard Owen
Well, one of the points you made, which resonated very strongly with me – and you mentioned it early on – was this inability to map it to ultimate objectives or needs. In other words, the tendency to think of this as a solution looking for a problem. A lot of board-level conversation has said, well, we’re being told that AI, perhaps in particular generative AI – which in and of itself could be a bit of a rat hole for companies – is going to be transformative for the economy. Heck, even the former British prime minister, Tony Blair, has been advising that AI needs to transform the productivity of the entire economy of the United Kingdom. And so you’d better get about it as a board, right? You’d better go off and find some things to apply generative AI to. The tendency then is to rush off and, as you said, buy a bunch of technologies, put a team together, and start building things, without necessarily thinking through whether, at the end of the day, that’s a smart idea or what the end point for all this is in terms of its business impact. And maybe that separates, in no small part, the average company, as you describe it, which is not getting much out of it, from the outliers who in fact approach this from a much more thoughtful perspective than “we just have to do something because this is transformative.”

Bob Cooper
Right on, Richard. One of the phenomena we see is what some experts have called the shiny things disease – a technology-push wave driven by “golly gee whiz, isn’t this a neat technology, where can we use it?” Solutions looking for problems. By the way, in the early days of product development – and I was around in the early days – that was the number one reason for new product failure: technical departments pushing technologies, pushing solutions looking for problems. Over the last 30-40 years, product developers have gotten a little more intelligent, and they’ve started doing something called voice-of-customer research – market research, understanding customers’ pain points. These things are very familiar to people of the next generation, but back in the old days they weren’t. We’re finding out you’ve got to do the same thing when it comes to AI installations.

It’s not voice of customer now; it’s called voice of process and voice of business. Voice of process is more about metrics focused on the particular process the AI will be installed in. And voice of business means literally sitting down and interviewing potential users and understanding their concerns, their points of pain, what keeps them awake at night, and how you’re going to have to make the product in order to satisfy their needs and keep them happy. I mean, if they’re not happy, they’re not going to use it, and you’re going to get what is called pilot paralysis once you move into the pilot, because it won’t move past there. A lot of projects never move past the pilot because the users just push back. Voice of process, voice of business – two fairly simple solutions.

Richard Owen
Well, and that leads me on to another point you made here, which was unrealistic expectations. To some extent it’s the flip side of a coin, which is almost hubris. I think companies start from a perspective – and I understand this, by the way, from the perspective of internal politics – that if you’re on a data science team or an IT team today, it’d be good to have a whole bunch of AI projects on your resume.

Your tendency is to say, yes, we can do this. We can build this ourselves. We have the knowledge, we have the in-house expertise. And by the way, to your point earlier, we also have a vendor here who’s going to sell us some picks and shovels, and they are absolutely confident there’s gold in the hills – and they carry no risk if we don’t find any. So the expectations get tied to this almost hubristic approach that says, despite the odds, we’re the ones who are going to beat them. And a good deal of self-interest gets put into overselling a capability, which is going to result in a lot of disappointment, when more modest achievements might have been practical or, frankly, the whole idea was perhaps out of whack from the get-go.

Bob Cooper
I think human behavior is playing a major role here. I remember when I first started investigating why so many projects in business failed to achieve their objectives, doing interviews in one large consumer products company in Ohio.

I won’t mention their name, but they’re in Cincinnati. The expression I heard from one executive was: nobody ever got promoted by having their project canceled. And so the name of the game is to get your darn project approved. That means showcasing your project. So when you stand in front of management, seeking the dollars to move forward with it, obviously you paint your project in the best possible light. And to the extent the vendor has given you ammunition to do this, so much the better. So of course expectations are way overstated. We found in traditional project management – new product development, for example – that they were typically overstated by a factor of about 2.5. In other words, people would promise a million dollars in benefit and actually deliver about 400,000. That was the average. And that’s a pretty large correction factor. I know what you’re going to say – just divide by 2.5 – but it doesn’t quite work that way. So it is an issue, and it’s not just an AI issue; it’s been true historically.

Richard Owen
Well, I was going to say, this sounds like the history of IT projects. But as you said earlier, we’re going to go through a period of disappointment and waste, largely because of lessons we never learned from previous generations of technology – because there’s nothing we’re saying here today that you couldn’t have said about a whole series of technologies 10 or 20 years ago, right? And then we’re going to see the success stories emerge. To some extent, that’s going to come from best practices and standardization, and from people starting to understand where the applications are. We were home-growing CRM systems 20 years ago, and today you buy them off the shelf, because at the end of the day somebody solved all the hard problems already and we’ve learned how to do this. Surely we’re going to go through the same curve again.

Bob Cooper
I can see some glimmers of hope. I was working with a technical component provider for both military and civilian aircraft. They were saying they’d been playing around with AI for the last two or three years with largely no strategy – individuals in the company essentially getting a budget and going out and buying a piece of shiny software. A piecemeal basis, I guess, rather than a big-picture strategic overview. Now they’ve started doing two things. Number one, taking a more strategic point of view, which was probably needed – looking at the whole organization, or the whole R&D function, or whatever area you’re trying to introduce AI to. And the other thing they’ve started to do: they have methodologies for technology projects that have worked in the past, and they’re starting to apply some of those, modified, because they build in best practices and they build in tough go/kill decision points along the way. And there are a lot of tough go/kill decision points in an AI project – obviously, it’s not just an automatic go from the get-go. You start in, and you may find out bad things partway along and have to pull the plug. Those are tough decisions. So they’re starting to use those methodologies, modified to suit an internal AI project, and it is working, they’re saying. And it’s about time, they said – it’s about using a good process, a good map. If you’re going on a long journey, as he said, you need a map. And we didn’t have a map.

Richard Owen
Yeah, and I’d like to pick up a couple of other points you made, because, by the way, I think it’s a great article. To some degree, I think you’d say it’s a summary of the existing research that’s been out there, but brought together in a really succinct way of looking at these challenges. It should be required reading for anyone who’s about to embark on some home development exercise. Let’s talk about talent, because I’d draw together a couple of your observations, one of which was that there’s a tendency to put teams together that are highly technical – data science, machine learning ops, IT resources – and lack any sense of domain. If we think about it, most good solutions are going to come from a combination of deep domain expertise and technical skill. If you’re trying to automate something or, as you said, improve turbine efficiency, you’re not telling me that no one on that team understands the engineering practicalities of building turbines, right? So you’re combining people who have deep domain expertise with people who have the data science, and perhaps the systems and tools perspective, to be successful. And yet so many projects seem to originate out of a data science team. I wonder how much of that is the mythology that data science is this incredibly scarce resource and somehow all-powerful. We’ve tried to deify data scientists a little bit, and it’s led to an almost arrogance that says, you know, we can solve anything. Of course that’s a gross generalization, but it seems like where it goes wrong is when we overemphasize the data science and under-emphasize the domain knowledge.

Bob Cooper
Well, I’m not going to comment too much on the arrogance of the IT department, because I’d get into trouble there, obviously. But this is not unusual. Again, the article was entitled ‘Why AI Projects Fail – Lessons from New Product Development’, and product development has had the same problem – whether it’s AI people with a data science background, or a bunch of chemists at DuPont with a chemistry background, or a bunch of folks at Hewlett-Packard in Palo Alto with a strong electrical engineering and computing background. These guys are treated somewhat like gods.

I mean, Hewlett and Packard were gods, and Dr. Land, who started Polaroid, was godlike. And they sort of ran the projects, to the detriment of the projects themselves. The project leader is typically from that department, and most of the people on the project team are from that department. It’s only been in the last 20 years, at least in the field of innovation and product innovation, that people have really been saying: put together a cross-functional team, including people from the scientific part of the business, of course, but also people from marketing, from production, et cetera. That way you get different inputs, because all of those people have to be on board and have their input in order to make this thing work. We’re starting to learn that in AI projects too.

It’s not a new concept, but it does work. And as I say, lessons from product development – that was part of the message of this article.


Richard Owen
Did you see in your research any distinctions between companies based on their focus – let’s draw a distinction between B2B and B2C, or more complex products? Because the examples of success you gave were in some ways heavily engineering-centered companies that have solved product problems for complex products, right? But is there anything we can learn from looking at this through the lens of consumer markets versus business markets? Obviously you’re associated with the ISBM, an organization I’ve had a lot of dealings with in the past, and it’s a bit of a rare animal in that it focuses so much on B2B. Any thoughts on that?

Bob Cooper
Well, I am an engineer by training. And although I am part of the ISBM, the Institute for the Study of Business Markets, a lot of my work over the years has been with consumer goods companies, including the one I just referred to a few minutes ago in Cincinnati, and a number of others like that – Unilever in the UK, Sherwin-Williams paint in the US, ICI in the UK, also paint, et cetera. Some of these companies, of course, are both B2B and B2C. It is true that the wow applications seem to be more in engineering – mechanical engineering, electrical engineering – types of design products. That, of course, could include some consumer products, technical products like an iPhone or another electronic device. But if you’re thinking about food products and consumer packaged goods in the grocery store, not quite as much. We’ve done another study, Richard, also through the ISBM; we also did it in Ireland and in Germany. We looked at about 20 applications of AI in the new product process, from generating good ideas all the way down to the pre-launch product test to make sure products really work before going to market, and the interesting thing is which AI-enabled tasks had the strongest impact on key KPIs like acceleration, productivity, and better decisions.

They were all in the middle of the process: design, engineering, prototype building, prototype testing, virtual twins – or digital twins – and simulations. Those seem to have the highest financial impact. Idea generation, yeah, that’s good, but it doesn’t seem to have quite the same dollar impact, nor does putting together a good business case using AI. You know, AI will help you build a business case, AI will help you design the launch of the product,

do the pricing, the advertising, et cetera. But those don’t have quite the same dramatic impact as the ones more in the middle of the process, the development and testing phases. And so logically you gravitate a little more towards an engineering company like Siemens and a bit away from a beer company like Guinness, because it’s a little harder to do AI for a Guinness-type product.

Richard Owen
You know, if you think about the current popular conception in the marketplace, almost all of it is consumer application, right? The most exciting thing to happen – arguably the thing that has caught the attention of the market – has been generative AI, and the most obviously cited use cases around generative are often personal or creative. And so we’ve got this public narrative that AI is going to transform the world, ranging anywhere from Elon Musk’s robo-taxis to self-driving cars – all these very visible elements. And yet there’s an argument that that’s all sort of for show. It’s created a lot of attention, but the money’s being made by the quiet application of these technologies to solving complex problems that are often engineering related, or product related, or in the back office – automation and productivity within the enterprise. That disconnect, by the way, might also start to show up in the stock market, because it’s a lot slower to take shape. There’s a golfing term: drive for show and putt for dough. Maurice, who’s an active golfer, will have more of a viewpoint on that. But are we doing that now with AI? Are we driving a lot for show?

Bob Cooper
Are we doing that now? Well, you ask good questions. Let me put this to you. This is an interesting issue, because we have debated amongst ourselves – groups of people that I’ve given seminars with – whether AI is going to be found more in the product, or behind the scenes developing the product, because AI can be used in both ways. Let me give you an example.

I mentioned Guinness. Guinness is a company I worked with for many years. I had a great time flying into Dublin all the time – St. James’s Gate, et cetera. Lovely city. You know, at the time they had a beer tap in every meeting room. Very civilized. In any event, in an industry like that, a process industry, whether it’s food and beverage or chemicals or whatever, you can use AI a long way along the process. For example, AI has been used very heavily in the pharmaceutical industry, and to almost a greater extent in the chemical industry, to come up with new molecules – new molecules that do this or that. BASF, out of Ludwigshafen, is doing quite a bit of that in Germany – literally using AI to come up with new molecules. But they’re also using AI to come up with new ideas for new products. I mean, if you don’t believe me, just sit down and talk to your AI, your chat, and say: I am looking for a new beer idea – a beer that is healthy, doesn’t have a lot of alcohol in it, but sounds like a lot of fun to drink. Can you give me some ideas for what I should develop or brew? And it’ll come up with some really good ideas.

So idea generation, concept generation, testing, putting together the launch plan, doing the marketing communications – yeah. But how do we build AI into beer? I can’t really picture a pint of Guinness with little computer chips floating around in it telling me, drink now, it’s at the right temperature. Right? And that’s the way we sort of think about AI in the product.

And some of the products, of course, are silly, like Procter & Gamble’s new smart toothbrush with AI built in – I guess it tells you whether you’re brushing your teeth correctly. There are lots of examples like that.

Richard Owen
Well, there’s a term for that, by the way. They call it AI washing – the tendency to take your vacuum cleaner, or your toothbrush, and label it as AI-enabled.

Bob Cooper
Yeah. Dyson, the UK company, has one out. It identifies allergens, I think they’re called, and deliberately sucks those up.

Richard Owen
And you wouldn’t bet against James Dyson actually managing to incorporate AI into a vacuum cleaner and double the price for it and sell it to people. So maybe that’s the exception.

Bob Cooper
And Nike’s another one – they have shoes that apparently lace themselves to the right level of tension so you’ll be a more effective basketball player. So, is AI going to manifest itself more in the product, or behind the scenes in the development of the product? I think in certain industries, the more mechanical and electrical industries, it’s probably more behind the scenes, although right now the balance is about 50-50 – that’s what folks are saying. It’s going to shift more toward the product as people figure it out. Now, I used the example of a beer. It’s hard to imagine AI in a beer until you start thinking, how about one of those labels – what is it they call it, an RFID tag? – so that maybe when you’re sitting in the pub drinking beer out of a bottle, for example,

it’ll allow you to play games on your iPhone, or pose an interesting challenge question for you.

Richard Owen
A nightmarish view of the future, a dystopian future. Sounds extremely plausible to me. But we’re kind of coming up against time, Bob. I wanted to sort of leave on one theme here, which was despite all this, I mean, it sounds to some degree as though you’re still an optimist here, that this is a natural process that companies are going to go through, and they’re going to emerge the other side and we’ll figure this out and we’ll move out of the sandbox phases into more mature development processes. Is that a fair characterization of where you came out on this?

Bob Cooper
I think so. If you get it right, the benefits are significant. Some companies have modeled the way to show that’s true – as I said, early adopters with deep pockets who know how to do things right. For the rest of us, the great majority of us, it’s a matter of getting it right, and so many mistakes are being made at the moment. Interestingly enough, Richard, there was a Harvard Business Review article that actually used the phrase “dumb reasons for failure” in the title, because most of them are very dumb reasons that can be avoided with a little bit of best practice built in.

Richard Owen
I couldn’t believe that they actually put that word dumb in the title.

Bob Cooper
Yeah, and that’s basically what my article was outlining. Here are the reasons, guys.

They’re all actionable. People have solved them before in other contexts. Learn from how they did it before and apply it to your particular endeavor. So I am very optimistic. The benefits are there if you get it right. The key is getting it right.

Richard Owen
Yes, when I read your piece, the thing that struck me was, again, that there wasn’t anything there which you’d look at and say, okay, this is revolutionary. But it’s a really great way of pulling these points together and backing them up, as you said earlier, with the hard research, in a form that makes it a very compelling and straightforward read. So congratulations on that. Bob Cooper, thank you very much indeed for joining us today. We really enjoyed it, and good luck with getting more visibility, especially for the piece. Hopefully people watching this will pick it up, read it, and get some benefit from it.

Bob Cooper
Well, thank you, Richard, for the opportunity. And I wish everybody who’s attempting to move forward on this AI journey to fare well – literally, do well.


ABOUT THE CX ICONOCLASTS

Dr. Robert G. Cooper is ISBM Distinguished Research Fellow at Pennsylvania State University’s Smeal College of Business Administration, Professor Emeritus at McMaster University’s DeGroote School of Business (Canada), and a Crawford Fellow of the Product Development and Management Association (PDMA).

Bob is the creator of the Stage-Gate® process model, now the most popular idea-to-launch NPD process globally (for physical product firms). He also developed Stage-Gate-TD for internal technology projects, and co-developed the Agile-Stage-Gate process. In terms of project selection models, Cooper developed the original NewProd™ scoring model and the Value-Based Model for NPD projects.

More recently, Bob has been heavily engaged in research into AI adoption for NPD by firms in the USA and Germany, and its impact on NPD performance. He has published seven articles on AI in NPD in peer-reviewed journals in 2023-2024.

Over his career, Bob has published 12 books – including the “bible for NPD”, Winning at New Products – and more than 160 articles on the management of new products, most in refereed journals. He is noted by Research.com as the #2 most-cited marketing professor globally, after Dr. Phil Kotler. He has won the IRI’s (Innovation Research Interchange) prestigious Maurice Holland Award three times for “best article of the year”. Bob has also helped hundreds of firms over the years implement best practices in product innovation, including companies such as 3M, BASF, Bosch, Danfoss, Dow Chemical, DuPont, ExxonMobil, GlaxoSmithKline, Guinness, HP, ITT Industries, LEGO, and P&G.

Cooper holds Bachelor’s and Master’s degrees in Chemical Engineering from McGill University in Canada, and a PhD in Business and an MBA from Western University, Canada. Website: http://www.bobcooper.ca


Richard Owen is celebrated as a leading figure in the Customer Experience industry, primarily known for his contribution as CEO at Satmetrix, where he and his team, along with Fred Reichheld, developed the Net Promoter Score methodology, now the globally dominant approach to customer experience measurement. His efforts further extended to co-authoring “Answering the Ultimate Question” with Dr. Laura Brooks, establishing netpromoter.com, and initiating both the NPS Certification program and a successful conference series. Owen’s diverse 30-year career has seen him drive technology-led business transformations at Dell, lead software companies such as AvantGo to a Nasdaq listing and Satmetrix to acquisition by NICE Systems, and engage in venture investment and board roles. Today, he spearheads OCX Cognition, leveraging machine learning for real-time NPS and customer health analytics.

