
It’s been called everything from ‘too dangerous to release’ and ‘the Robot Apocalypse’ to ‘overhyped’ to ‘shockingly good.’

Originally developed by OpenAI, a research lab in San Francisco, GPT-3 has gained more traction than any language processing model in recent memory.

An (admittedly) oversimplified explanation of GPT-3 is as follows: a language processing tool that can do everything from writing poetry and code to answering questions about the world. When asked to produce a piece of information, GPT-3 taps into 175 billion parameters, or weights, to generate a response.

(Yes, that’s 175 billion with a “B”.)

The expansive, if not vague, potential use cases here range across a wide spectrum. You could ask GPT-3 to write a poem about baseball, with the caveat that the verse should mimic Emily Dickinson’s style. You could just as easily ask GPT-3 to take a list of facts and devise a legal complaint that looks no different from the typical document filed by a first-year lawyer.

It’s ambitious. And like anything ambitious, it carries plenty of possibilities and drawbacks. So let’s scale it all back and focus on one question: how will GPT-3 impact the B2B space?

Working in the data space as the first employee of ZoomInfo, then known as DiscoverOrg, I have seen the good, the bad, and the ugly of harnessing the power of big data. Let’s examine three data-centered predictions from the perspectives of a prospective buyer, an aspiring vendor, and B2B at large.

1) Buyer Beware: Cheap Imitations Are Likely To Flood The Market

The sad reality is that the world is getting flooded with small companies stretching the definition of AI-powered intelligence. And as with everything, the reasoning becomes clear the second you decide to follow the money. 

You want proof? I’ll give you proof: according to Brex, companies with “.ai” domains raise 3.5x more money than those with “.com”. This makes sense: investors are willing to take larger gambles on the new frontier.

But here’s the problem: a study from 2019 found that 40% of AI startups don’t actually use AI. It also found that companies claiming to use AI receive 15-50% more funding than non-AI companies.

It appears it’s happening in the B2B data world, too. One provider has the top-level domain “.ai” and even has “AI” in its name. Why do I suspect its reliance on AI might be oversold?

According to LinkedIn, the company has 85 employees and has never employed a data scientist, machine learning engineer, or data engineer. Yet its website is littered with references to artificial intelligence that generates “perfect” data.

Here’s a little secret from an expert in the industry: it’s impossible to curate perfect B2B data. A company willing to stretch the truth on data quality likely has no qualms about embellishing the amount of AI under the hood.

Artificial intelligence is a buzzword in the software world, and let’s face it: it’s smart marketing to associate your brand with artificial intelligence.

The problem with these marketing campaigns and the prevalence of AI-driven sales pitches is that they make it difficult to separate the pretenders from the legitimate companies leveraging AI.

2) Vendor Viewpoint: Beta Testing Iterative AI Like GPT-3

Low barriers to entry can make evaluating products or services difficult. But what about the value proposition of not only GPT-3, but AI in general? You can slice and dice it any number of ways, but let’s simplify: AI ingests, interprets, and analyzes real-time information to return value.

Perhaps even more attractive is that AI is typically engineered to iterate, refine, and improve its processing engines to constantly drive better performance over time.

Yet beta testing for B2B use cases may prove difficult, because one of the biggest problems evaluators are finding is that GPT-3 always gives an answer. It’s like the friend who pretends to know everything, never says “I don’t know,” and always takes a guess.

Look at these examples I pulled from Kevin Lacker’s blog:

Q: Who was president of the United States in 1600?

A: Queen Elizabeth I was president of the United States in 1600.

Q: Who won the World Series in 2023?

A: The New York Yankees won the World Series in 2023.

Other times it’s simply wrong.

Q: How many eyes does a spider have?

A: A spider has eight eyes.

Q: When counting, what number comes before 10,000?

A: Nine thousand and ninety-nine [9,099] comes before 10,000.

Some questions simply don’t have an answer, and some questions have an answer of “none.” While I haven’t yet been given access to GPT-3, I can only suspect from reading others’ tests that it would not handle the following questions well:

“What is the parent company for Microsoft?”

“How many employees work at Enron?”

Questions like these, when posed to machines, will always yield nonsensical answers, or even worse, incorrect answers that appear to make sense. There will always be a limit to what machines can do, at least over the next 10-15 years. Which leads to my next big prediction...
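The guessing problem is, at least in part, an application-layer choice: nothing forces a system to answer every question. Here is a minimal sketch of the behavior evaluators are asking for, using a hypothetical lookup table as a stand-in for a real model or data source (the questions and facts below are illustrative, not GPT-3 output):

```python
# Sketch: a Q&A layer that treats "I don't know" and "none" as valid answers,
# instead of guessing the way GPT-3 does in the examples above.
KNOWN_FACTS = {
    "how many eyes does a spider have?": "Most spiders have eight eyes.",
    "what is the parent company for microsoft?": None,  # no such parent exists
}

def answer(question: str) -> str:
    key = question.strip().lower()
    if key not in KNOWN_FACTS:
        return "I don't know."   # refuse to guess on unknown questions
    fact = KNOWN_FACTS[key]
    if fact is None:
        return "None."           # some questions genuinely have an empty answer
    return fact

print(answer("How many eyes does a spider have?"))
print(answer("Who won the World Series in 2023?"))  # out of scope: no guess
```

The point isn’t the toy table; it’s that a trustworthy B2B tool needs an explicit “no answer” path, which GPT-3, out of the box, does not expose.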

3) Marketplace Margins: Data Gap Hinders Innovation and Reliability

The ongoing arms race between B2B organizations to invest in the exploding landscape of sales and marketing technologies is nothing new. Think of these technologies as appliances in your house, the things you rely on to operate day-to-day, from your refrigerator to your phone charger. We use them for all sorts of tasks, but we rarely consider the underlying electricity powering them. And too often, the lights go out...

Why? 

The same goes for even the most common applications adopted by go-to-market teams: CRMs, sales automation, marketing automation, and retargeting platforms. All of them are only as effective as the data (read: electricity) housed within or piped into them. Granted, you won’t see that disclaimer in the fine print of the contract. But the bottom line is that a garbage-in, garbage-out approach to data management torpedoes a technology’s ability to deliver on even the most basic processes (and value propositions).

The electricity powering GPT-3’s data processing logic is (most likely) exponentially more complicated. In other words, instead of wiring a traditional colonial, we need to wire a smart house.

Now, it’s easy to balk at the analogy. I’ll even concede a point many people don’t realize: it’s fairly easy to build a static database of company information. And that’s fine. But it’s the maintenance, the cleansing, the merging, the splitting that make it hard. Data doesn’t typically work with you as it ages; it works against you.
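To make the maintenance point concrete, here is a toy sketch of just one of those chores: merging duplicate company records that disagree as they age. The records, field names, and normalization rule are all hypothetical; real B2B data pipelines face far messier versions of the same problem.

```python
from datetime import date

# Toy company records collected at different times; fields are illustrative.
records = [
    {"name": "Acme Corp",  "employees": 120, "as_of": date(2018, 3, 1)},
    {"name": "ACME Corp.", "employees": 240, "as_of": date(2020, 6, 1)},
    {"name": "Initech",    "employees": 75,  "as_of": date(2019, 9, 1)},
]

def normalize(name: str) -> str:
    """Cheap normalization so 'Acme Corp' and 'ACME Corp.' collide."""
    return name.lower().rstrip(".").strip()

def merge(records):
    """Keep the freshest value per company: stale data works against you."""
    merged = {}
    for rec in records:
        key = normalize(rec["name"])
        if key not in merged or rec["as_of"] > merged[key]["as_of"]:
            merged[key] = rec
    return merged

companies = merge(records)
print(companies["acme corp"]["employees"])  # the fresher 2020 figure wins
```

Even this trivial version has to decide which duplicate to trust, and the decision changes as new data arrives. Scale that to millions of records with dozens of conflicting fields, and the static-database head start evaporates quickly.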

I suspect we’ll see a rush of lean start-ups trying to make it in the data world. They’ll pat themselves on the back for scraping some data, only to be stifled by the complexities of maintaining B2B data. Less accurate solutions will hit the market, and some buyers will be fooled by clever showmanship.

Where Do We Go From Here?

I believe we’re still a long way from putting the innovation made possible by something like GPT-3 into practice in the B2B space. Then again, advances have been made across the board in B2B AI use cases in recent years, which is why more companies are investing in data science tools, resources, and teams. And that’s a win for everyone.

About the author

Derek Smith

Derek is the Senior Vice President of Innovation and Data R&D at ZoomInfo, the leading B2B contact database solution and sales intelligence platform.


