Dave Rogenmoser, CEO and co-founder of Jasper, on the generative AI opportunity

Rohit Kaul

Background

Dave is the CEO and co-founder of Jasper, an AI-assisted copywriting app. We spoke to Dave about Jasper's growth, opportunities in generative AI, competition from large tech incumbents and TAM expansion.

Questions

  1. Can you give us a brief history of the evolution of generative AI tools over the last few years and why they have suddenly become popular in 2022?
  2. Within six months of launch, your ARR crossed $15M. Could you help us understand where you found early traction, what customer segments and use cases saw rapid adoption, and what your core growth loop looked like?
  3. Did the specificity of your early positioning as a tool for writing Facebook/Google ads help in attracting early users to Jasper?
  4. What is your core customer profile and how has it trended over the last few months?
  5. How does Jasper use the GPT-3 foundational model? Do you only use vanilla GPT-3 or build your own fine-tuned models on top? Are there ways in which you are generating content differently from other apps out there?
  6. How do you collect training data from user engagement signals and how do you use it for fine-tuning your models?
  7. Does the model fine-tuning happen on OpenAI or your infrastructure? Who owns the fine-tuned models? Are they proprietary to you?
  8. When users generate content, does some of it come from GPT-3 models and others from Jasper’s fine-tuned models? Is it even the right way to think about it?
  9. AI foundational model companies like Midjourney mention that 10% of their cost goes to training and 90% to inference. As an application that sits on top of a foundational model, how is your cost structure different from them and what are the key components?
  10. Some critics say that with LLMs like GPT-3, trained on terabytes of data, fine-tuned models will take a long time to add value to users, if at all. Is their understanding of model fine-tuning correct?
  11. The AI-assisted content generation market is rapidly getting crowded with companies like Copy.AI, Rytr, Writesonic, Writer, and Peppertype.ai. How is Jasper positioned vis-à-vis these other players, and where do you see differentiation coming from?
  12. Do you see Jasper as an AI company or a sales/marketing company that happens to use AI?
  13. With OpenAI and others making it cheap and friction-free to build AI applications, it’s a common refrain that all incumbents like Microsoft, Google, and Canva need to do is add AI to their existing apps to win this space. What do you think about the competitive threat posed by these kinds of incumbents?
  14. Jasper recently released a Chrome extension. Do you see Jasper evolving into a standalone app like Canva/Figma or more of a utility embedded across user workflows like Grammarly?
  15. There are different generative AI apps tackling different types of content across text, image, audio, and video. Do you see these boundaries blurring, or is there a constraining function that prevents apps from bundling all of them together? Do you see Jasper evolving to cover more categories beyond text and image?
  16. Are you seeing any surprising changes in how folks use Jasper via the Chrome extension?
  17. Some critics believe that generative AI apps are creating generic, low-quality content that is not close to what humans can create. What do these critics get wrong?
  18. Emad Mostaque, the CEO of Stability AI, mentioned it took him less than $600k to train Stable Diffusion. As the cost of compute falls, what are the implications for a company like Jasper? Will we see more companies building self-trained models and shifting away from foundation models?
  19. As the cost of compute falls, do the foundational model companies like OpenAI, Stability AI and others also face the risk of commoditization? How do open source models change the dynamics in the foundational model layer?
  20. What does Jasper’s margin profile look like at scale? Are there scenarios in which scale improves the margins and unit economics?
  21. What does Jasper become in the next five years if everything you're doing goes right?

Interview

Can you give us a brief history of the evolution of generative AI tools over the last few years and why they have suddenly become popular in 2022?

I first got interested in generative AI when GPT-3 came out in summer of 2020. I couldn't get access to it at the time, but that was the first time it felt like the technology in terms of output quality had crossed some threshold to where it was really useful and usable.

That, combined with the API model that OpenAI chose to go with, allowed a lot more people to use it—instead of having to spin up your own servers.

That was a big breakthrough in terms of tech. And then Stability AI rolling out an open source text-to-image model was really what opened up the floodgates. It wasn't all consolidated to OpenAI or one company. Now, everyone could play with this and iterate on it.

Now, it just seems like the quality of the output has actually gotten useful. A combination of smart people having breakthroughs, computing costs coming down a bit, and access to GPUs that weren't available before made this a perfect storm.

Within six months of launch, your ARR crossed $15M. Could you help us understand where you found early traction, what customer segments and use cases saw rapid adoption, and what your core growth loop looked like?

Yeah, when we launched, we'd already been working together. I've got two co-founders I've been working with for eight years now. We already had a team and we already had a market—we came from a marketing background and knew exactly what marketers would want. Plus we had built up a pretty strong community of marketers already, so we had a bit of a head start in a lot of ways and could hit the ground running hard.

When we launched it, we thought Jasper was a Facebook ad copy tool and that was how we positioned it. It was going to help you write Facebook ads and Google ads and maybe your landing pages, but it was more for ad copy. The company at the time was called conversion.ai, and it was much more of a conversion focus—making your ads convert better.

We haven't done anything fancy growth-wise. We've just executed on a couple of channels and a couple of strategies really, really well and made really strategic hiring decisions along the way.

Early on, we invested heavily in the community because we just thought that there's so much that you can do with these tools that we've got to train people—and we could never train people ourselves all the way. This community turned into so much more than just a more efficient way to train people on the product, and now this community has very real input into the direction of the product and company. 

It's also a very word-of-mouth product in and of itself—everybody that sees it is awed, and they often go tell somebody about it the next day. It's community, it's word of mouth, enabling skilled marketers to work their magic, and really strong product-market fit. Those are really the big areas we focused on, and we were just really, really aggressive in all of those.

Did the specificity of your early positioning as a tool for writing Facebook/Google ads help in attracting early users to Jasper?

Yeah, that was our wedge into the market—saying, "Hey, we're going to be really, really good and really focused at direct response copywriting." 

I'd had the background in that so I could speak from authority. We had very high quality outputs because we were training the models and prompting the models with really great examples. Because I was a marketer, I just knew what a really high bar would look like.

We were pulled pretty quickly into much longer form content. We were asking our users, "Hey, what do you want us to do here?" And they're like, "We want to write blog posts." So we quickly built out more of a UI that looks kind of like Google Docs where you can go and generate longer blog posts, longer emails, longer social media posts and stuff like that. That's our primary use case now. Probably 60 or 70% of our users use Jasper to write blog posts. 

What is your core customer profile and how has it trended over the last few months?

It’s generally been people wanting to scale their marketing. Some of that is marketing teams—at medium-sized or large companies—and some of it is freelancers. Some of it's just marketing generalists at small companies doing it for themselves.

We’re definitely skewed much more to SMBs and prosumers, though as we’ve become more well known, we're seeing a ton of traction from bigger companies that want to do this and want to adopt it too. Our plan is to expand to meet the needs of bigger companies without losing our appeal to freelancers and prosumers, because that’s our crew and our community, and they really need Jasper and love it.

How does Jasper use the GPT-3 foundational model? Do you only use vanilla GPT-3 or build your own fine-tuned models on top? Are there ways in which you are generating content differently from other apps out there?

When we first started, we just used vanilla GPT-3. It’s a misconception that everyone who uses a specific model creates the exact same output.

Maybe 20 to 30% of the output quality can be influenced and improved on top of the base model through prompt engineering, different settings, and so on. So even out of the gate, before we were doing anything fancy, we were consistently getting better quality than all of our competitors, just based on knowing marketing, having a use case, and knowing how to prompt our model and give it great examples.
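
As a rough illustration of the kind of prompt engineering Dave describes (an instruction, hand-picked examples, and sampling settings layered on top of a base model), here is a minimal sketch using the pre-1.0 OpenAI Python SDK that was current around the time of this interview. The template, example ads, and parameter values are hypothetical assumptions, not Jasper's actual prompts.

```python
import os

import openai  # pre-1.0 openai Python SDK (completions API)

openai.api_key = os.environ["OPENAI_API_KEY"]

# Hypothetical few-shot prompt for an ad-copy template: a short instruction,
# two hand-picked example ads, then the user's product brief.
PROMPT_TEMPLATE = """Write a punchy Facebook ad for the product below.

Product: Cold-brew coffee subscription
Ad: Wake up to barista-level cold brew on your doorstep. First bag is on us.

Product: Minimalist running shoes
Ad: Less shoe, more run. Feel the ground again in our 6 oz trainers.

Product: {product_brief}
Ad:"""


def generate_ad(product_brief: str) -> str:
    """Call a base completion model with the few-shot prompt above."""
    response = openai.Completion.create(
        model="text-davinci-002",  # a base model, no fine-tuning involved
        prompt=PROMPT_TEMPLATE.format(product_brief=product_brief),
        temperature=0.8,           # part of "the different settings" Dave mentions
        max_tokens=80,
        top_p=0.95,
    )
    return response.choices[0].text.strip()


if __name__ == "__main__":
    print(generate_ad("Noise-cancelling headphones for open offices"))
```

Swapping in better examples or different sampling settings changes output quality without touching the underlying model, which is roughly the 20 to 30% Dave is describing.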

Output quality is really what we sell. It’s our north star, and we are relentless about being best in class at it. We want to have the highest quality output of anybody. For me, I'll use any tool or any model available to do that.

Some of our outputs will come from fine-tuned models based on our own proprietary data that learns from our customers. We start to get a little bit of a data flywheel there that helps us generate better and better models.

But the fine-tuning doesn't always work—it’s not a given that you can just train up a model and it'll be better than a base model—so it really depends.

We’ve also started to experiment and use open source or our own models in various places in the app. Again, there's not a one-size-fits-all. It depends on the use case, it depends on the template, and it depends on a lot of other factors.

I think some people get nervous or they think, "Oh, how can you even build a business if you're using the same kind of foundation models as everybody else?" But there are lots of areas to differentiate, and if you just stay focused on the core customer, it becomes much clearer what you need to build, either in the AI or outside of it.

How do you collect training data from user engagement signals and how do you use it for fine-tuning your models?

We've got 50+ templates that our customers use. We allow users to rate their outputs, which gives us visibility into what’s working and what’s not. We'll pull out as many of those as we can and then go train the model and say, “Hey, we want a separate model that's still kind of based off the same foundation, but we want it to be more fit to this, because this is what our users want.” Then that model will come out higher quality. Do that enough times and you really start to get a higher quality model.
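
As a sketch of what such a loop could look like, the snippet below keeps only highly rated generations, writes them out as prompt/completion pairs, and submits a job through the legacy OpenAI fine-tunes endpoint (pre-1.0 Python SDK). The data shapes, rating threshold, and separators are assumptions for illustration, not Jasper's actual pipeline.

```python
import json
import os

import openai  # pre-1.0 openai Python SDK (legacy fine-tunes endpoint)

openai.api_key = os.environ["OPENAI_API_KEY"]

# Hypothetical user-rated generations collected from one template.
rated_outputs = [
    {"prompt": "Write a blog intro about remote onboarding.",
     "completion": "Remote onboarding doesn't have to feel remote.", "rating": 5},
    {"prompt": "Write a blog intro about churn metrics.",
     "completion": "Churn is just a number, until it isn't.", "rating": 2},
]

# Keep only the highly rated outputs and write them as prompt/completion pairs,
# the JSONL format the legacy fine-tunes API expects.
with open("blog_intro_finetune.jsonl", "w") as f:
    for row in rated_outputs:
        if row["rating"] >= 4:  # illustrative threshold
            f.write(json.dumps({
                "prompt": row["prompt"] + "\n\n###\n\n",
                "completion": " " + row["completion"] + " END",
            }) + "\n")

# Upload the file and kick off a fine-tune on top of a shared foundation model.
training_file = openai.File.create(
    file=open("blog_intro_finetune.jsonl", "rb"), purpose="fine-tune"
)
job = openai.FineTune.create(training_file=training_file.id, model="davinci")
print(job.id)  # the resulting model is separate from the base model
```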

Does the model fine-tuning happen on OpenAI or your infrastructure? Who owns the fine-tuned models? Are they proprietary to you?

For us, it’s still hosted by OpenAI. They’re still running it. They've built an interface to be able to do that without us having to host it all ourselves.

But what we’ve done is proprietary—their core model is not getting trained on our data. What’s there is a unique model that only we have. There are constantly new models from multiple R&D partners that each have their own merits, so it’s up to us to have a cutting-edge team that’s able to identify the models we want to adapt and build upon them to create the best possible outputs.

It’s a fast moving thing, and we try a lot of stuff—but it's always hard to predict what's going to work well.

When users generate content, does some of it come from GPT-3 models and others from Jasper’s fine-tuned models? Is it even the right way to think about it?

Every different action in our product is hitting a different model. Some of them are shared models, but each one is going to be hooked up to a particular model that's going to go and actually receive the input and generate the output there. Some of them run through multiple models in a given run. 

We might do something first to clean it up or augment it and then go to GPT-3 next, or the reverse—it might be GPT-3 first, and then we clean it up with some of our own models. Generally, though, they're hitting either a foundation model or a fine-tuned model with any given action inside the product.
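
To make that routing idea concrete, here is a minimal, self-contained sketch in which each product action maps to a pipeline of steps, and a step can be a cleanup pass, a foundation-model call, or a fine-tuned-model call. The action names, stages, and placeholder functions are hypothetical and stand in for real model calls; this is not Jasper's actual architecture.

```python
from typing import Callable, Dict, List

# Each product action maps to a pipeline of steps. A step might clean or
# augment text, call a foundation model, or call a fine-tuned model.

def clean_text(text: str) -> str:
    # stand-in for an in-house cleanup/augmentation step
    return " ".join(text.split())

def call_foundation_model(text: str) -> str:
    # placeholder for a call to a base model such as GPT-3
    return f"[foundation model output for: {text}]"

def call_finetuned_blog_model(text: str) -> str:
    # placeholder for a call to a template-specific fine-tuned model
    return f"[fine-tuned blog model output for: {text}]"

Pipeline = List[Callable[[str], str]]

ACTION_PIPELINES: Dict[str, Pipeline] = {
    # clean the input first, then hit the foundation model
    "facebook_ad": [clean_text, call_foundation_model],
    # generate with a fine-tuned model first, then run a cleanup pass
    "blog_post": [call_finetuned_blog_model, clean_text],
}

def run_action(action: str, user_input: str) -> str:
    """Run the user's input through every step registered for the action."""
    output = user_input
    for step in ACTION_PIPELINES[action]:
        output = step(output)
    return output

if __name__ == "__main__":
    print(run_action("facebook_ad", "  noise-cancelling   headphones "))
```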

AI foundational model companies like Midjourney mention that 10% of their cost goes to training and 90% to inference. As an application that sits on top of a foundational model, how is your cost structure different from them and what are the key components?

I don't actually know what the split would be for that compared to our own training, our own hosted models, and things like that—but again, it's going to be so different from Midjourney because they're actually doing a lot of the training themselves, more than us.

I think of them as a layer one. We do a little bit of that, but mostly we're kind of layer two. We're saying, “Hey, we're going to sit on top of these things, use them, and then just build the product for the end user there.” Midjourney is more like an OpenAI in my mind.

Some critics say that with LLMs like GPT-3, trained on terabytes of data, fine-tuned models will take a long time to add value to users, if at all. Is their understanding of model fine-tuning correct?

Today, you’re seeing more conversations around whether we can do mostly the same as what we’re doing now with a model that's maybe 10% the size of GPT-3. What we’ve found is that yes—in specific situations, and perhaps with a narrower set of use cases—you don't need these huge models.

They’re expensive to use, they're expensive to work with, they're hard to train, and if you can do what you want to do with a smaller model, that’s going to be much better. 

I can't definitively say that models are going to start getting smaller. Models will probably still continue to get bigger as we see what they can really do, but it's definitely not a given that bigger is better. By and large, if we can do it with a smaller model, we're going to do that. But obviously, GPT-3 hit something with a big model that works really, really well.

The AI-assisted content generation market is rapidly getting crowded with companies like Copy.AI, Rytr, Writesonic, Writer, and Peppertype.ai. How is Jasper positioned vis-à-vis these other players, and where do you see differentiation coming from?

We are heading for a time when artificial intelligence is widely available. But being widely available and actually usable to achieve business outcomes are two very different things. Jasper’s focus is on developing the best application layer for businesses and on building intelligent interoperability among models.

At the end of the day, we're a product company. We live and die off our product being really great. And we measure that greatness on how often a user gets a quality output from the product.

That being said, there's so many other pieces of growing a great business. We've invested heavily in our community. We've differentiated our brand. Obviously we’ve really nailed our go-to-market and just focused on a few channels.

AI is important and there's differentiation there for sure, but what’s really powerful is making sure the UX and the workflows are all really aligned with the community, persona, and the go-to-market. We're not just saying, “Here's how to use the tool.” We're saying for example in the marketing use case, “Here's how to write great blog posts that rank.”

Do you see Jasper as an AI company or a sales/marketing company that happens to use AI?

We talk about it as just solving a problem. I didn't come into this thinking, "Oh, I really want to use AI." I'd seen AI, I'd seen GPT-3, and I thought, “Oh my gosh, that really solves a problem I know exists and that's 10 times better than the way that problem is being solved now.” 

We approach it as, "Hey, our customers have this problem where they need to write lots of great high quality content for their readers and they're overwhelmed and they don't have enough resources to do that—how can we solve that?"

Some of that you solve with great UX, some of it you solve with great workflows, some of it you solve with great integrations, some of it you solve with great AI. All of those combined roll up into hopefully being a solution that's better than anything else on the market.

With OpenAI and others making it cheap and friction-free to build AI applications, it’s a common refrain that all incumbents like Microsoft, Google, and Canva need to do is add AI to their existing apps to win this space. What do you think about the competitive threat posed by these kinds of incumbents?

For Jasper, we see ourselves as the future central content brain that every person at your whole company sits on top of. The models will be trained to each person’s tone of voice, how you want to speak, your product catalogs, and all of that, no matter where you create content on the internet.

That tool will hook into every other tool that you're using, so even if Google Docs has a great generative tool—even if Microsoft has a great one—it's not going to be as good an experience for a marketer or a salesperson who needs to build content everywhere. You’re going to go into Google Docs and you’ll have one tone of voice, and then you log out and you go into Facebook and you try to write your ads, and that'll be in a different tone of voice.

What people are going to want is this unified experience that speaks how they speak everywhere, and that's really where we see Jasper fitting in—Jasper will be the tool that bridges all of those and is personalized to your company.

Jasper recently released a Chrome extension. Do you see Jasper evolving into a standalone app like Canva/Figma or more of a utility embedded across user workflows like Grammarly?

Historically, yeah, Jasper's been a web app where you log in, you write your content, and then you copy and paste it back out. That really is not an ideal experience. We want to be everywhere that you're writing content, so as we grow, people will use the web app less and less often. 

By and large, Jasper should already be inside your Facebook, or your Google Docs, and be helping you write—and an easy first step there is to have a Chrome extension that just kind of instantly supercharges any text you’re writing on the internet as long as you're using Chrome.

We also will have deep integrations into these other tools to the degree that they let us. We'd love to have a deep integration into Canva. We'd love to have a deep integration into HubSpot. We’d love to have a deep integration into Webflow and into these other different tools where you can just toggle Jasper on and use it really, really powerfully inside them.

The future state of Jasper is one where probably 95% of usage is happening inside of other apps or even on your phone.

There are different generative AI apps tackling different types of content across text, image, audio, and video. Do you see these boundaries blurring, or is there a constraining function that prevents apps from bundling all of them together? Do you see Jasper evolving to cover more categories beyond text and image?

Jasper is poised to serve sales teams, customer support teams, and all other teams that rely on content. I want everyone at every company to use Jasper. Right now, we're really focused on marketing to get a strong wedge into the market, but we need to branch out from there.

On the mediums, mainly I think we’ll let users help us decide. That’s Jasper’s superpower—our community has very real input into the direction the product takes, which mitigates a lot of risk and guesswork. I never want to put bragging rights on having the most features within an AI content tool ahead of the improvements my customers are actually asking for.

We certainly need to stay focused. There's real risk in trying to be too broad, and you end up building a tool that doesn't really solve anyone's problem deeply—it just kind of solves everyone's problem a little bit.

Are you seeing any surprising changes in how folks use Jasper via the Chrome extension?

It’s still early days, but the Chrome extension is something our customers love. Right now, Jasper's something you have to remember to go log into, but if you just see the little Jasper icon throughout your work, you're more likely to use it. So yeah, we're already seeing this is going to be really good—I could really see the majority of our usage switching to the Chrome extension relatively quickly as we keep working on that.

Some critics believe that generative AI apps are creating generic, low-quality content that is not close to what humans can create. What do these critics get wrong?

I think right now, you can create really, really high quality stuff using Jasper. Really high quality content. But we always say you've got to play a role in editing it. Jasper left to himself is probably not going to come up with expert human level content. We see Jasper as a tool for writers, not a replacement. Creators using Jasper can get more time back for research, developing angles, and infusing lived experience into their content.

What’s interesting is that while Jasper doesn’t have emotions, it’s trained on data that lets it write stuff that looks very dramatic and is very in-tune with the emotions the writer wants to evoke. Jasper can very closely imitate people in that kind of style.

A year from now, two years from now, text generation's going to be five or 10 times better than it is now. I think people really underestimate how fast this is all going to keep getting better.

Emad Mostaque, the CEO of Stability AI, mentioned it took him less than $600k to train Stable Diffusion. As the cost of compute falls, what are the implications for a company like Jasper? Will we see more companies building self-trained models and shifting away from foundation models?

I think it's both. The foundation level will definitely fragment—they’ll have a lot of competition and a lot of people with different breakthroughs. I think the foundational model layer will commoditize first and more quickly than the app layer.

That doesn’t mean there won’t still be a few huge businesses—just look at cloud computing, which is highly commoditized. AWS is still massive. What it won’t be is a gold rush around which model you can get your hands on and all of that. People will just have access to a bunch of good models, and it'll put far more pressure on them executing to do something useful with it.

It’s like with AWS—it doesn't really matter which one you use. You just pick one and go. It's more about what you’re going to build for customers now that you have that power. That’s where the execution is going to need to happen.

As the cost of compute falls, do the foundational model companies like OpenAI, Stability AI and others also face the risk of commoditization? How do open source models change the dynamics in the foundational model layer?

That’s not to say that the app layer is totally safe. All of this gets commoditized over time, especially for apps that don't continue to build real value at the app layer. But if you build a really great tool that solves a really big problem using AI, you're going to be in good shape.

What does Jasper’s margin profile look like at scale? Are there scenarios in which scale improves the margins and unit economics?

We've basically got normal SaaS margins right now, and they’re strong. At scale, again, pricing pressure goes both ways: our COGS gets cheaper and cheaper as the foundational layer gets more commoditized, but at the same time, you see perhaps more pricing pressure on what we can charge customers. All of that's obviously good for the end user.

What does Jasper become in the next five years if everything you're doing goes right?

We see Jasper as being the tool that every person at every company is using to generate great, high quality content. Right now a lot of these generative AI tools—perhaps fairly—have received a rep for low quality. And we have no intentions of replacing anybody that's a skilled writer, but what we eventually want this to be is something that creates content everyone is proud of—that companies feel good putting their name behind.

In this space, we think Jasper could be a hundred billion dollar company and really a behemoth of this new way of thinking about software. The market is massive, and as long as we don't screw up and we keep executing well, we think there's a pretty clear path to get there.

Disclaimers

This transcript is for information purposes only and does not constitute advice of any type or trade recommendation and should not form the basis of any investment decision. Sacra accepts no liability for the transcript or for any errors, omissions or inaccuracies in respect of it. The views of the experts expressed in the transcript are those of the experts and they are not endorsed by, nor do they represent the opinion of Sacra. Sacra reserves all copyright, intellectual property rights in the transcript. Any modification, copying, displaying, distributing, transmitting, publishing, licensing, creating derivative works from, or selling any transcript is strictly prohibited.
