Man, you guys are pessimistic and stuck in your bubbles. You're asking if Box is relevant when they've had growth the last two quarters while Dropbox is on the verge of seeing its numbers drop. Yes, MS and Google are seeing more, but the product Box is putting out is light years ahead of Dropbox, Egnyte, and, honestly, most of FAANG in terms of supporting the average office worker. They aren't building sexy AI coders, but they are making it so Sue Ellen, the EA to the VP of fuckall, can automate things and make life easier for everyone.
I worked at Box, albeit for only a year, and saw Aaron's passion up close. He could step into an executive briefing center meeting, pull out projects the customer had been working on a year prior that didn't directly involve Box, and ask honest, empathetic questions about them.
He's a nerd and loves talking about this stuff. Those saying he's trying to remain relevant or find a new home for Box miss the fact that he's on CNBC and other business channels a few times a year just talking tech. Unlike Houston, who only shows up when Dropbox has a sad launch to tout.
Aaron is passionate about Box and has been fighting an uphill battle since day one, but I've held on to my thousands of shares since I left and the value has doubled. I fully expect AI to accelerate their future and lead to a bigger balance sheet.
Aaron Levie strikes me as one of the sharpest and most prescient people in SV. I think you must be 100% correct that he's a true nerd about this stuff, or else he could have gone on to a thousand different things outside of Box by now. He would probably be an incredible VC (I think I recall he was one of the first seed investors in Stripe as a result of cornering Patrick Collison at a house party, talking his ear off, and thoroughly impressing Collison).
> Sue Ellen, the EA to the VP of fuckall
I died
Part of the reason you're seeing pessimism about Box is that the stock has barely kept up with inflation. Their IPO price was $23.23; now it's at $32.30. That's 39% growth, while inflation over those 10 years is about 36%.
So sure, their CEO is passionate and on CNBC, but the business is clearly not doing that great.
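A quick sanity check on those figures, as a rough sketch (prices as quoted above; the ~3.1%/year average inflation rate is an assumption):

    ipo_price, current_price = 23.23, 32.30
    growth = current_price / ipo_price - 1      # ~0.39, i.e. ~39%
    inflation = 1.031 ** 10 - 1                 # ~0.36, i.e. ~36% cumulative over 10 years
    print(f"growth: {growth:.0%}, inflation: {inflation:.0%}")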
Stock price isn't necessarily equivalent to how well the company is doing, though. I'd be more interested in hearing about revenue and profit.
(Not asking you to go dig that up, just making the point that the market can be irrational.)
You can find this online pretty easily; they have only been profitable since 2023 (https://www.macrotrends.net/stocks/charts/BOX/box/eps-earnin...)
Irrational over 10 years? Nah. That's a reflection that investors don't see much growth in earnings coming from future investments.
The claim that companies won't just build everything in-house because they want to be able to sue over damages seems extremely weak. If AI tools could genuinely build a replacement for Box with trivial effort, everyone would drop them immediately.
There's some strangeness around AI where people will claim it will do amazing things, but somehow not so amazing that no one will need these vendors anymore.
You're missing the bigger point about context. Companies pick their focus and what to spend their time on. Past startup scale, the cost of internal failures grows enormously. AI might be able to whip up the exact same legal document or CRM or whatever, but when one thing goes wrong, that might cost $1 million in revenue. It's about paying for accountability over outcomes.
Do some math to see if it makes sense. If you have 10,000 employees, any software subscription that costs $100 per person per year across all employees runs a million dollars a year. Many licenses cost more than that. Even if you do experience the occasional million-dollar outage, you'll still have easily saved money.
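As a quick back-of-the-envelope sketch (the seat price and the outage cost are the illustrative numbers above; the five-year horizon and the zero build/maintenance cost are simplifying assumptions):

    employees = 10_000
    seat_cost_per_year = 100          # $ per employee per year, as above
    years = 5                         # assumed horizon
    outage_cost = 1_000_000           # one hypothetical internal failure

    subscription_total = employees * seat_cost_per_year * years  # $5,000,000
    in_house_total = outage_cost      # ignores the cost of building and maintaining the replacement
    print(subscription_total - in_house_total)                   # 4000000 saved, under these assumptions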
I see some comments here that are critical of Aaron, but I enjoyed the talk. Pretty good insider info on trends in the "enterprise" vis-a-vis AI and SaaS in general.
I especially liked the framing of structured vs. unstructured data, and the opportunity to make unstructured data more useful with LLMs.
I don't have much to criticize or add except to emphasize that AI as it stands today is probabilistic, not deterministic -- ask the same question twice and you could get vastly different answers.
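A minimal, self-contained illustration of that last point (the toy vocabulary and probabilities are made up; real models sample from their own token distributions):

    import random

    vocab = ["yes", "no", "maybe"]
    probs = [0.5, 0.3, 0.2]           # assumed output distribution for one fixed question

    def answer() -> str:
        # sample an answer instead of always taking the most likely one
        return random.choices(vocab, weights=probs, k=1)[0]

    print(answer(), answer())         # the same "question" can yield different answers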
I remarked on this back in March, prompted by Théophile Cantelobre's visualization of the connections between people in the culinary world and the restaurants they've been involved in.
https://bjornwestergard.com/llm-extractors/
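For anyone curious, the rough shape of that kind of LLM-based extraction looks something like this sketch (call_llm is a hypothetical stand-in for whatever chat-model API you use, and the schema is just an example):

    import json

    PROMPT = """Extract every (person, restaurant, role) relationship from the text below.
    Return ONLY a JSON array of objects with keys "person", "restaurant", "role".

    Text:
    {text}
    """

    def extract_relationships(text: str, call_llm) -> list[dict]:
        # call_llm: hypothetical callable that takes a prompt string and returns the model's text
        raw = call_llm(PROMPT.format(text=text))
        try:
            return json.loads(raw)    # structured records out of unstructured prose
        except json.JSONDecodeError:
            return []                 # models sometimes emit malformed JSON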
From the video: a lot of new companies are going to start
Well, there is this as well - https://finance.yahoo.com/news/openai-board-chair-doubles-do...
Altman was touring globally to raise $7T (yes, trillion) just a few months ago. So I'm not sure if they are purposefully trying to scare people off.
>Altman was touring globally to raise $7T (yes, trillion), just a few months ago.
A few points about that: It wasn't just a few months ago, it was more than a year ago; he wasn't raising funds for OpenAI, it was for a separate chip-building venture; the $7T number was universally regarded as laughable, and the whole thing has predictably gone nowhere; and, FWIW, both OpenAI and Altman have publicly denied that he was ever trying to raise such a comical amount.
My biggest gripe with this is the reasoning for “no, AI is not going to take everyone’s jobs.”
Obviously the state of “workers and AI together” comes first. So there’s not going to be some strawman cliff where suddenly all jobs are replaced in a month.
But people like him lay out this future and then act like AI just stops there. Of course there will be an intermediate state. And then that state will be passed over as AI moves further up the chain and humans are eliminated from office labor entirely.
My biggest gripe with your reasoning is a hidden assumption that everything humans do is easily encodable in what we call AI today.
I don’t think this is the case.
AI and technology are already replacing jobs.
The way this manifests isn't mass layoffs after an AI is implemented; it's fewer people being hired at any given scale, because you can go further with fewer people.
There are companies making billions in revenue with under 10k employees, some with under 5k or even under 1k.
This is absorbed by more and more opportunities opening up, because the cost of starting a new company and getting to revenue also decreases as labor productivity increases.
Jobs that would otherwise exist get replaced. Jobs at companies that otherwise wouldn’t exist get created.
And in the long run, until it's just unprofitable to employ humans (when the max their productivity is worth relative to AI falls below a living wage), humans will continue working side by side with AGI, as even relatively unproductive workers (compared to AI) will still be net productive.
> AI and technology are already replacing jobs
I don’t think this is true. I think CEOs are replacing people on the assumption that AI will be able to replace their jobs. But I don’t think AIs are able to replace any jobs other than heavily scripted ones like front-line customer support… maybe.
I think AI can automate some tasks with supervision, especially if you’re okay with mediocre results and don’t need to spend a lot of time verifying its work. Stock photography, for example.
But to say AI is replacing jobs, I think you’d need to be specific about what jobs and how AI is replacing them… other than CEOs following the hype, and later backtracking.
> (when the max their productivity is worth relative to AI falls below a living wage), humans will continue working side by side with AGI as even relatively unproductive workers
This assumes that humans will be unwilling to work if their wage is below a living wage. It depends on the government's social programs, but if there are none, or only very bad ones, people will probably be more desperate and thus more willing to take even the cheapest jobs.
So in this world of overabundant human labor, the cost of human labor might be much closer to zero than to a living wage. It all depends on how desperate to find work government policy makes people.
I've seen AI replacing a lot of jobs already in the regulatory/consultancy business, which makes billions. A lot of people producing paperwork for regulatory and similar purposes have been replaced by language models. My question: should this business really exist at all?
> because you can go further with fewer people
Can you, though? In my experience this is just wishful thinking. I have yet to see actual productivity gains from AI that would objectively justify hiring fewer people or laying people off.
This is pretty obvious when you know what to look for.
How many people did it take to build the pyramids? Now how many would it take today?
Look at revenue per head and how it’s trended
Look at how much AUM has flowed into asset management while headcount has flatlined
We can't prove why people are being replaced, and the people who claimed to have replaced workers with AI don't have a lot of good outcomes to show. There is some success now, but it's often bespoke to that particular environment, so your reasoning would be sound if the premise were. We need more information.
How about a concrete example: what jobs at Bank of America will humans have?
I cannot imagine any scenario, other than complete model stagnation, in which its current workforce of 213,000 people still has jobs to do.
Also, it's such a strawman to assume that "absolutely everything" needs to be covered by these systems to basically eliminate office jobs from humanity. In this context they only need to do enough to make hiring humans pointless at most businesses. Even if, say, you need some strategically important humans for a long time, I don't see a coming world where huge job losses don't happen. David filing HR paperwork at Big Corp is not suddenly going to be doing strategy work.
Likewise, it's a strawman to assume I'm arguing that your nanny or the local firefighters are going to be replaced by an AI soon.
It also seems like people read some normative assumption into talk about job loss. I'm not making a normative claim, nor a policy one, just pointing out that it seems stupid not to prepare for or expect this to happen.
It will be encodable in what we call AI tomorrow
Except not literally tomorrow, of course. So you might as well say 1 million years from now...
Honest question: why is the CEO of an enterprise cloud storage company so publicly touting the benefits of AI?
Conversely, why aren't the senior leaders of Google Drive, SharePoint, etc. so consistently speaking about the benefits of AI?
A couple of top-of-mind thoughts on the range of possible explanations:
- Perhaps it's a requisite function of a company that went through a long phase of mild growth (anemic, even, relative to the Nasdaq composite) and is seeking to maintain relevance through PR/marketing spend.
- They are investing in a step-function product roadmap that could accelerate revenue growth and are positioning the brand for a future product launch.
- A slightly more cynical view: the CEO is searching for a new home for the company. It's been over 20 years, which feels like a long time for a public technology company to have a single CEO.
- The CEO is positioning himself personally for a new VC fund. He's had some success with previous investments and is looking to preemptively market his new fund.
- A slightly more realistic view: the CEO (who seems really enthusiastic) is honestly just like the rest of us, got nerd-sniped by this AI thing, and feels genuinely excited about it :-)
Either way, he seems very well networked and able to appear nearly everywhere, recapping information and insights from other source material. I'm curious what the underlying incentives and intended outcomes are for this kind of PR blitz.
> Conversely, why aren't the senior leaders of Google Drive
Google execs are constantly talking about the benefits of AI. People lower on the totem pole just don't have the same platform as the top execs at Google do.
And by Google, I mean Alphabet, not just search.
He's being a salesman; he wants to convince enterprises they need Box AI.
When was the last time Box was relevant? Dropbox is struggling; I can only imagine what Box is going through. Their stock isn't great, and the outlook is "limited growth".
Box is relevant in the enterprise space. It's a very different business from Dropbox.
"A feature, not a product" looks like it was right in the long run.
He’s trying to stay relevant