More and more plainly, OpenAI and Anthropic are making plays to own (and lease) the "means of production" in software. OK - I'm a pretty happy renter right now.
As they gobble up previously open software stacks, how viable is it that these stacks remain open? It seems perfectly sensible to me that these providers and their users alike have an interest in further centralizing the dev lifecycle - eg, if Claude-Code or Codex are interfaces to cloud devenvs, then the models can get faster feedback cycles against build / test / etc tooling.
But when the tooling authors are employees of one provider or another, you can bet that those providers will be at least a few versions ahead of the public releases of those build tools, and will enjoy local economies of scale in their pipelines that may not be public at all.
It’s a small tool shop building a tiny part of the Python ecosystem; let’s not overstate their importance. They burned through their VC money and needed an exit, and CLI tool chains are hyped now for LLMs, but this mostly sounds like an acquihire to me. Dev tools are among the hardest things to monetize, with very few real winners, so good for them to get a good exit.
Just a tiny project with over 100 million downloads every month, over 4 million every day. No big deal. Just a small shop, don't overstate its importance.
The “requests” package gets downloaded one billion times every month; should that be a multi-billion-dollar VC company as well? It’s a package manager and other neat tooling. It’s great, but it’s hardly the essence of what makes Python awesome; it’s one of the many things that makes this ecosystem flourish.
In the 2024 Python developer survey, 18% of the ecosystem used Poetry. When I opened this manifold question[0], I'm pretty sure uv was about half of Poetry downloads.
Estimating from these numbers, probably about 30% of the ecosystem is using `uv` now. We'll get better numbers when the 2025 Python developer survey is published.
Same. It's game-changing - leaps and bounds above every previous attempt to make Python's packaging, dependency management, and dev workflow easy. I don't know anyone who has tried uv and not immediately thrown every other tool out the window.
been in the python game a long time and i've seen so many tools in this space come and go over the years. i still rely on good ol pip and have had no issues. that said, we utilize mypy and ruff, and have moved to pyproject etc to remotely keep up with the times.
uv solved it; it will be the only tool people use in two more years. If you’re a Python shop / expert then you can do pip etc., but uv turned incidental Python + deps from a huge PITA for the rest of us into It Just Works simplicity, on the same level as or better than Golang.
If not, do you develop software with source dependencies (go, java, node, rust, python)? If so, how do you handle acquiring those dependencies—by hand or using a tool?
What’s the point of constraining oneself to what is in the OS package manager? I like to keep my dependencies up to date. The versions in the OS package manager are much older.
And let’s say you constrain yourself to your OS package manager. What about the people on different distros? Their package managers are unlikely to have the exact same versions of your deps that your OS has.
Finally someone competent to answer the crucial question. Taking into account the enormous amount of excellent work you did, and the fact that dev tools are hard to monetize, what was your strategy?
Not to mention their language server + type checker `ty` is incredible. We moved our extremely large python codebase over from MyPy and it's an absolute game changer.
It's so fast, in fact, that we just added `ty check` to our pre-commit hooks, where MyPy previously had runtimes of 150+ seconds _and_ a mess of bugs around its caching.
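For anyone wanting to try the same, a minimal sketch of such a hook as a pre-commit "local" hook — note this assumes `ty` is already installed in your environment (e.g. via `uv tool install ty`); the hook id and name are arbitrary choices, not an official Astral-provided hook:

```yaml
# .pre-commit-config.yaml (sketch; assumes `ty` is on PATH)
repos:
  - repo: local
    hooks:
      - id: ty-check          # arbitrary id
        name: ty check
        entry: ty check       # run the type checker
        language: system      # use the ty binary from the environment
        types: [python]       # only trigger on Python file changes
        pass_filenames: false # check the whole project, not just staged files
```

Using `language: system` with `pass_filenames: false` trades pre-commit's per-file filtering for a whole-project check, which is only practical because the checker is fast.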
I have personally interacted with migrating features, and I have also seen internal forks of CPython from more than one group. It isn't necessarily that they are a few features ahead; what I have seen is that they have a set of targeted features, where they use their best judgement about what they can now leverage. That doesn't necessarily mean it's better for everyone.
My experience is that I have personally committed code from Cinder (Meta's internal CPython fork) back into CPython, and it was determined that some features didn't align with CPython; this isn't some kind of dig at either project. Also, at a large mega-corporation, nearly all groups that use critical, highly complex software will have internal forks. I'm not sure exactly what uv will have added internally, but instead of being versions ahead, I think they will have patch collections that might be beneficial to OpenAI but not necessarily for everyone. I hope that OpenAI will interact with the rest of the community, and that those internal patches can be ported into the overall system if they provide benefit.
It's not any different from the launch of the FSF. There's a simple solution. If you don't want your lunch eaten by a private equity firm, make sure whatever tool you use is GPL licensed.
> If AI "clean-room" re-implementations are allowed to bypass copyright/licenses, the GPL won't protect you.
Isn't that the same for the obligations under BSD/MIT/Apache? The problem they're trying to address is a different one from the problem of AI copyright washing. It's fair to avoid introducing additional problems while debunking another point.
Maybe I'm reading wrong here, but what's the implication of the clean room re-implementations? Someone else is cloning with a changed license, but if I'm still on the GPL licensed tool, how am I "not protected"?
3a. usually here BigCo should continue to develop Project One as GPLv3, or stop working on it and the community would fork it and continue working on it as GPLv3
3b. BigCo does a "clean-room" reimplementation of Project One and releases it under proprietary licence. Community can still fork the older version and work on it, but BigCo can continue to develop and sell their "original" version.
2. BigCo owns ProjectOne now
3a. BigCo is now free to release version N+1 as closed source only.
3b. Community can still fork the older version and work on it, but BigCo can continue to develop and sell their original version.
While the license is important, it's the community that plays the key role for me. VC-funded open source is not the same as community-developed open source. The first can very quickly disappear because of something like an acquihire; the second has more resilience and tends to either survive and evolve, or peter out as the context changes.
I'm careful to not rely too heavily on VC funded open source whenever I can avoid it.
The biggest scam the mega-clouds and the Githubs ever pulled was convincing open source developers that the GPL was somehow out of vogue and BSD/MIT/Apache was better.
All so they could just vacuum it all up and resell it with impunity.
The big cloud providers are perfectly happy to use GPL'd stuff (see: Elastic, MySQL). They don't need to use embrace-and-extend, they're content with hosting.
The ones pushing for permissive licenses are rather companies like Apple, Android (and to some extent other parts of Google), Microsoft, Oracle. They want to push their proprietary stuff and one way to do that in the face of open source competition is by proprietary extensions.
I remember a somewhat prominent dev in the DC area putting on Twitter around 2012 or so something like "I do plenty of open source coding and I don't put a fucking license on it" and it stuck with me for all these years that it was a weird stance to take.
In the many darker timelines that one can extrapolate, capturing essential tech stacks is just a pre-cursor to capturing hiring.
Once we start seeing OpenAI and Anthropic getting into certifications and testing, they'll quickly become the gold standard. They won't even need to actually test anyone. People will simply consent to having their chat interactions analyzed.
The models collect more information about us than we could ever imagine because, definitionally, those features are unknown unknowns for humans. For ML, the gaps in our thinking carry far richer information about us than our actual vocabularies, topics of interest, or stylometric idiosyncrasies.
If it ever goes bad, well I hope that that’s an impetus for new open source projects to be started — and with improvements over and lessons learned from incumbent technologies, right at the v1 of said projects.
I think the issue is that LLMs are a cash problem as much as they are a technical problem. Consumer hardware is still pretty unfriendly to running models that are actually competitive, so if you want to even do inference on a model that will reliably give you decent results, you're basically in enterprise territory. Unless you want to do it really slowly.
The issue that I see is that Nvidia etc. are incentivised to perpetuate that so the open source community gets the table scraps of distills, fine-tunes etc.
You got me thinking that what's going to happen is some GPU maker is going to offer a subsidized GPU (or RAM stick, or ...whatever) if the GPU can do calculations while your computer is idle, not unlike Folding@home. This way, the company can use the distributed fleet of customer computers to do large computations, while the customer gets a reasonably priced GPU again.
The kinds of GPUs that are in use in enterprise are 30-40k and require a ~10KW system. The challenge with lower power cards is that 30 1k cards are not as powerful, especially since usually you have a few of the enterprise cards in a single unit that can be joined efficiently via high bandwidth link. But even if someone else is paying the utility bill, what happens when the person you gave the card to just doesn’t run the software? Good luck getting your GPU back.
The problem is that even if an OSS project had the resources (massive data centers the size of NYC packed with top-end custom GPU kits) to produce the weights, you need enormous VRAM-laden farms of GPUs to do inference on a model like Opus 4.6. Unless the very math of frontier LLMs changes, don’t expect frontier OSS on par to be practical.
I feel like you're overstating the resources required by a couple orders of magnitude. You do need a GPU farm to do training, but probably only $100M, maybe $1B of GPUs. And yes, that's a lot of GPUs, but they will fit in a single datacenter, and even in dollar terms, there are many individual buildings in NYC that are cheaper.
There's already an ecosystem of essentially undifferentiated infrastructure providers that sell cheap inference of open weights models that have pretty tight margins.
If the open weights models are good, there are people looking to sell commodity access to it, much like a cloud provider selling you compute.
unless they are also pirate LLMs, I don't see how any open source project could have pockets deep enough for the datacenters needed to seriously contend
If AI tools are as good as the CEOs claim, we should have no friction towards building multiple open source alternatives very quickly. Unless of course, they aren’t as good as they are being sold as, in which case, we have nothing to worry about.
What would the new open source projects do differently from the "old" ones? I don't think you can forbid model training on your code if your project is open source.
Honestly, for now they seem to be buying companies built around Open Source projects which otherwise didn't really have a good story to pay for their development long-term anyway. And it seems like the primary reason is just expertise and tooling for building their CLI tools.
As long as they keep the original projects maintained and those aren't just acqui-hires, I think this is almost as good as we can hope for.
Once you’re acquired you have to do what the boss says. That means prioritizing your work to benefit the company. That is often not compatible with true open source.
How frequently do acquired projects seriously maintain their independence? That is rare. They may have more resources but they also have obligations.
And this doesn’t even touch on the whole commodification and box out strategy that so many tech giants have employed.
This is a logical conclusion of most open source tools in a capitalist economy, it's been this way for decades.
Equivalent or better tools will pop up eventually, heck if AI is so fantastic then you could just make one of your own, be the change you want to see in the world, right?
But how does this work out in the long run, in the case of AGI?
If AGI becomes available, especially at the local and open-source level, shouldn't all these be democratized - meaning that the AGI can simply roll out the tooling you need.
After all, AGI is what all these companies are chasing.
Could you say the same about the Chrome browser? Google is using it to EEE the web (Embrace, Extend and Extend it till it's a monstrosity that nobody else can manage). That's pretty antagonistic. But did people change?
Sample size: 1 but I use Arc browser. It's still webkit under the hood (and in maintenance mode now), though it's actually pretty good and last I checked had most of the baked in google stuff toggled-off by default
Of course they're trying to capture existing tech stacks. The models themselves are plateauing (most advancement is coming from the non-LLM parts of the software), they took too much VC money so they need to make some of it back. So gobbling up wafers, software, etc... is the new plan for spending the money and trying to prevent catastrophic losses.
Explain to me how this is any different than Microsoft, Blackrock, Google, Oracle, Berkshire or any other giant company acquiring their way to market share?
This is a serious risk for the open source ecosystem, and particularly for the scientific ecosystem that over the last years has adopted many of these technologies. Having their future depend on a capex-heavy company that is currently (based on reporting) spending approx. 2.5 dollars to make a dollar of revenue, and must achieve hypergrowth in the next few years or perish, is less than ideal. This should discourage anybody doing serious work from adopting more of the upcoming Astral technologies like ty and pyx. Hopefully, ruff and uv are large enough to be forked if (when) the time comes.
On the flip side, I'm not sure I ever saw a revenue plan or exit strategy for Astral other than acquihire. And most plausible bidders are unfortunate in one way or another.
Astral was building a private package hosting system for enterprise customers. That was their stated approach to becoming profitable, while continuing to fund their open source work.
A commodity yes, but could be wrapped in to work very nicely with the latest and greatest in python tooling. Remember, the only 2 ways to make money are by bundling and unbundling. This seems like a pretty easy bundling story.
Yeah you'd think so but somehow JFrog (makers of Artifactory) made half a billion dollars last year. I don't really understand that. Conda also makes an implausible amount of money.
From my understanding there are a lot of companies that need their own package repositories, for a variety of reasons. I listened to a couple podcasts where Charlie Marsh outlined their plans for pyx, and why they felt their entry into that market would be profitable. My guess is that OpenAI just dangled way more money in their faces than what they were likely to get from pyx.
Having a private package index gives you a central place where all employees can install from, without having to screen what each person is installing. Also, if I remember right, there are some large AI and ML focused packages that benefit from an index that's tuned to your specific hardware and workflows.
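To make the appeal concrete, uv can be pointed at a private index from project configuration — the sketch below is hedged: the URL is a placeholder, and the `[[tool.uv.index]]` table is uv-specific configuration that has evolved across versions, so check the uv docs for your release:

```toml
# pyproject.toml (sketch; URL is a placeholder for a company-internal index)
[[tool.uv.index]]
name = "internal"
url = "https://packages.example.com/simple"  # hypothetical private index
default = true  # prefer this index over PyPI for resolution
```

A central, default index like this is exactly the screening point described above: the company curates what lands in the index, and every employee's `uv add` resolves against it.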
They could have joined projects like the Linux Foundation which try to not depend on any single donor, even though complete independence from big tech is not possible. I don't know the motivation behind Astral's approach, but this acquisition does leave a weird taste behind about how serious they were about truly open source software. Time will tell, I guess. (Edit: typo)
> I don't know the motivation behind Astral's approach, but this acquisition does leave a weird taste behind about how serious they were about truly open source software.
There are ways to independently fund open source projects, though. I have previously contributed to the Python Software Foundation and to individual open source maintainers through GitHub donations (which are not dependent on GitHub, as there are many alternatives). Projects like the Linux Foundation exist, too. And government funding, especially for scientific endeavors or where software is used to fulfill critical state tasks, is an option, too. I refuse to subject to the hypercommercialization of software and still believe in the principles behind open source.
> I never adopted them, keep using mostly Python written stuff.
Maybe you use non-transitive pure Python dependencies, but it's likely that your tools and dependencies still rely on stuff in Rust or C (e.g.: py-cryptography and Python itself respectively).
I use mostly the batteries, given that the only purpose I have for Python, since version 1.6, is UNIX scripting tasks, beyond shell.
As mentioned multiple times, since my experience with Tcl and continuously rewriting stuff in C, I tend to avoid languages that don't come with JIT, or AOT, in the reference tooling.
I tend to work with Java, .NET, node, C++, for application code.
Naturally AI now changes that, still I tend to focus on approaches that are more classical Python with pip, venv, stuff written in C or C++ that is around for years.
This might be true for uv and ruff, and hopefully that will happen. But pyx is a platform with associated hosting and if successful would lock people into the Astral ecosystem, even if the code itself was open source.
How much money do they make from donations? I don't know, but: "In practice we frequently paid for travel and hardware."
Translation: nothing at all.
If such a fundamental project that is a revenue driver for so many companies, including Midas-level rich companies like Google, can't even pay decent salaries for core devs from donations, then the open source model doesn't work in terms of funding the work, even at the smallest possible level of "pay a reasonable market rate for devs".
You either get people who just work for free or businesses built around free work by providing something in addition to free software (which is hard to pull off, as we've seen with Bun and Astral and Deno and Node).
Not who I would've liked to acquire Astral. As long as OpenAI doesn't force bad decisions on to Astral too hard, I'm very happy for the Astral team. They've been making some of the best Python tooling that has made the ecosystem so much better IME.
That's the thing. To me that says that as soon as cash becomes tight at OpenAI, the Astral staff will no longer get to work on Python tooling anymore, namely `uv`, etc.
Eh, if it turns out to be too bad I guess I’ll just end up switching back to pipenv, which is the closest thing to uv (especially due to the automatic Python version management, but not as fast).
Does pipenv download and install prebuilt interpreters when managing Python versions? Last I used it it relied on pyenv to do a local build, which is incredibly finicky on heterogenous fleets of computers.
Can't blame you for not trusting OpenAI, but it seems to me they would gain very little from fucking up uv (or more precisely doing things that have a side effect of fucking up uv), and they have tons of incentive to cultivate developer good will. Better to think of buying and supporting a project like this as a very cheap way to make developers think they're not so bad.
Curious how well upstream contributors or projects get compensated in these sorts of headline-grabbing acquisitions (probably not at all, unfortunately).
OpenClaw notably was built around Mario Zechner's pi[0]; uv I believe was highly adapted from Armin Ronacher's rye[1], and uses indygreg's python-build-standalone[2] for distributing Python builds (both of which were eventually transferred to Astral).
In the worst case, Astral will stop developing their tools, someone else will pick them up and will continue polishing them. In the best case, they will just continue as they did until now, and nothing will really change on that front.
Astral is doing good work, but their greatest benefit for the ecosystem so far was showing what's possible and how it's done. Now everyone can take up the quest from here and continue. So any possible harm from here on out will not be that deep; at worst we will be missing out on many more cool things they could have built.
This has me thinking about VS Code and VS Codium. I've used VS Code for a while now, but recently grew annoyed at the increasingly prevalent prompts to subscribe to various Microsoft AI tools. I know you can make them go away, but if you bounce between different systems, and particularly deal with installing VS Code on a regular basis, it becomes annoying.
I started using VS Codium, and it feels like using VS Code before the AI hype era. I wonder if we're going to see a commercial version of uv bloated with the things OpenAI wants us all to use, and a community version that's more like the uv we're using right now.
MS is actively making your life using VS Codium a pain. They removed the download button from the extension marketplace, making it very difficult to download extensions and install them in VS Codium, since VS Codium does not have access to the official MS extension marketplace. Many don't publish outside the marketplace, for example PlatformIO. [1]
Somebody took a deeper look at Claude Code and claims to find evidence of Anthropic's PaaS offering [1]. There's certainly money to be made by offering a nice platform where "citizen developers" can push their code.
From Astral the (fast) linter and type checker are pretty useful companions for agentic development.
I don't think this holds because we're talking about developers who know how to use a package manager, on a piece of software you have to install anyways. The friction of "uv add $other_llm_software" is too low for it to have a real impact.
I think they're more into the extra context they can build for the LLM with ruff/ty.
I don’t think they’re targeting the C suite with it, because they don’t use uv and Microsoft already has Copilot for the “it’s bad but bundled with stuff you’re already paying for” market.
Why do you think that uv, etc. will stay maintained? They will for now, but as soon as cash is tight at OpenAI, they'll get culled so fast that you won't see it coming. This is the risk.
I'm not so sure. I sort of wish they hadn't been acquired because these sort of acquihires usually result in stifling the competition while the incumbent stagnates. It definitely is an acquihire given OpenAI explicitly states they'll be joining the Codex team and only that their existing open-source projects will remain "maintained".
I think it may be the first time I am actually upset by an acquisition announcement. I am usually like "well, it is what it is", but this time it just feels like betrayal.
I don't know. yarn never really turned into a vehicle to sell Facebook, though you always kind of transiently knew it was FB that offered it. I imagine that sort of transient advertising is its own value, too.
It's good news to me, considering their open-source nature. If/when they go downhill, there will still be the option to fork, and the previous work will still have been funded.
Now, for those wondering who would fork and maintain it for free: that is more of a critique of FOSS in general.
Being in this industry for over 20 years probably jaded me a lot, I understand that's the plan but it's almost always the plan (or publicly stated as).
Only time will tell if it will not affect the ecosystem negatively, best of luck though, I really hope this time is different™.
I've been in the industry for similarly long, and I understand and sympathize with this view. All I can say is that _right now_, we're committed to maintaining our open-source tools with the same level of effort, care, and attention to detail as before. That does not change with this acquisition. No one can guarantee how motives, incentives, and decisions might change years down the line. But that's why we bake optionality into it with the tools being permissively licensed. That makes the worst-case scenarios have the shape of "fork and move on", and not "software disappears forever".
I personally get a lot of confidence in the permissive licensing (both in the current code quality, and the "backup plan" that I can keep using it in the event of an Astralnomical emergency); thank you for being open source!
>Seems like the big AI players love buying up the good dev tooling companies.
Would be a good mustache-twirling cartoon-villain tactic, you know: try to prevent advances in developer experience to make vibecoding more attractive =)
It also hints even The Big Guys can’t LLM their tooling fully, and that current bleeding edge “AI” companies are doing that IT thing of making IT for IT (ie dev components, tooling, etc), instead of conquering some entire market on one continent or the other…
Makes you really think about true productivity. If these companies have beyond-cutting-edge unreleased models, and therefore the best possible tools, shouldn't they be able to poach just a few of the most important people for cheaper? And then those people could use AI to build a new, superior product very quickly. There is also buying a userbase. But I wonder how the key-talent purchase strategy would compare...
Yeah, well, the fact is that every person who ever touches Python needed uv, but only Astral folks created it. So, nope, there's no one capable of filling the void, just accept that it's fucked now. The best die first.
I feel some "commoditize your complements" (Spolsky) vibes hearing about these acquisitions. Or, potentially, "control your complements"?
If you find your popular, expensive tool leans heavily upon third party tools, it doesn't seem a crazy idea to purchase them for peanuts (compared to your overall worth) to both optimize your tool to use them better and, maybe, reduce the efficacy of how your competitors use them (like changing the API over time, controlling the feature roadmap, etc.) Or maybe I'm being paranoid :-)
All the vibe-coded webshitware these companies are putting out seems to be doing the opposite: it's all even more memory- and cycle-hungry than the webshit we were lovingly pooping out by hand for the last decade.
Anthropic acquiring Bun, now OpenAI acquiring Astral. Both show the big labs recognize that great AI coding tools require great developer tooling, and they are willing to pay for it rather than build inferior alternatives. Good outcome for the teams.
Not exactly a great look for the "AGI is right around the corner" crowd — if the labs had it, they would not need to buy software from humans.
While I -- like most other commenters -- am dubious of both OpenAI and this acquisition, I think it's pretty reasonable to wait to see how this turns out before rushing to final judgment.
Everything I've seen from Astral and Charlie indicates they're brilliant, caring, and overall reasonable folks. I think it's unfair to jump to call them sell-outs and cast uv and the rest as doomed projects.
I suppose my point is: I would expect that Charlie and co. carried their negotiations with OpenAI with the same laser-focused, careful judgment that catapulted Astral to success in the first place. I don't mean to fanboy, but I generally trust that they made the best decision for not only them, but the Python community as a whole.
We always "wait and see" and it always turns out terrible. Even if the original founders stay on, eventually they will get pushed out when their morals conflict with company goals. Won't happen overnight, but uv will enshittify eventually.
My initial reaction was being weirdly sad about this and I don't fully understand why yet. I read the headline, clicked into the link, and just went noooooooo. I really like uv and I hope it continues to do well, congrats to the team though and hope everyone there gets a good outcome.
I have long found the VC model for open source questionable. If you are not selling popular enough direct enterprise support, what is the model to actually make money?
Take ruff: I have used it, but I had no idea it even had a company behind it... And I must not be the only one, and it must not be the only tool like it...
Personally, I'd expect a few good years of stewardship, and then a decline in investment. I can only hope there are enough community members to keep things going by then.
Interesting acquihire. I would have assumed MS would have snagged them (until their __layoffs__ last year). My gut is that this is more for Python expertise, and ruff/ty knowledge of linting code than uv...
I'm a heavy user and instructor of uv. I'm teaching a course next week that features uv and ruff (as does my recent Effective Testing book).
Interesting to read the comments about looking for a change. Honestly, uv is so much better than anything else in the Python community right now. We've used projects sponsored by Meta (and other questionable companies) in the past. I'm going to continue enjoying uv while I can.
why? lots of good work came to Python from people who were sponsored by big tech companies. make Python better for them, and for a lot of other people too.
(sure, it's a bit different than contributing to CPython, but I'd argue not that different)
It is VERY different. One company now has complete control of the activities of the team developing these tools. Contributing to Python (money or time) gets you some influence, but doesn't allow you to dictate anything - there's still a team making the decisions.
uv and ruff are one of the best things that happened in the python ecosystem the last years. I hope this acquisition does not put them on a path to doom.
i feel like moves like this make it even harder for new open-source tools to break through. there's already evidence that LLMs are biased toward established tools in their training data (you can check it here https://amplifying.ai/research/claude-code-picks). when a dominant player acquires the most popular toolchain in an ecosystem, that bias only deepens. not because of any skewing, but because the acquired tools get more usage, more documentation, more community content. getting a new project into model weights at meaningful scale is already really hard. acquisitions like this make it even harder.
I'm also concerned about this, but I feel as though uv and ruff's explosive growth happening alongside and despite that of LLMs demonstrates that it's not a show-stopper. I vividly recall LLM coding agents defaulting to pip/poetry and black/flake8, etc. for new projects. It still does that to some extent, but I see them using uv and ruff by default -- without any steering from me -- with far greater frequency.
Perhaps it's naive optimism, but I generally have hope that new and improved tools will continue to gain adoption and shine through in the training data, especially as post-training and continual learning improve.
As a non-Python dev, I really thought uv and ty were great tools and liked their approaches, but I don't know how good it is that they are privately held... not a fan
Technically the tools are not privately held; they're OSS with a permissive licence. It's just that the bulk of the work was done by them. The acquisition (ostensibly) changes none of that.
Would there be any interest in me fixing the bugs in Pyflow and getting it updated to install newer python versions? It's almost identical to uv in concept, but I haven't touched it in 6 years.
Astral has demonstrated that there is desire for this sort of "just works" thing, which I struggled with, and led me to abandoning it. (I.e.: "pip/venv/conda are fine, why do I want this?", despite my personal experience with those as high-friction)
I thought some more about it, and unfortunately it makes sense. IIRC there were several "insider" blogposts from OpenAI that said something along the lines of "Yeah almost every service we write is FastAPI"
They are not trying to buy developer goodwill, they are trying to catch up with Anthropic in terms of getting those B2B contracts, which is currently the most realistic path towards not running out of money.
1. The Register reports OpenAI is well ahead of Anthropic in B2B contracts. It's Anthropic playing catch-up, not OpenAI.
2. In any case, the announcement strongly suggests that customer acquisition had little to do with this. The stated purpose, as I read it, is an acquisition (plus acquihire?) to bolster their Codex product.
3. But if they were hoping for some developer goodwill as a secondary effect... well, see my note above.
I'm confused about what will happen to pyx, their platform product that was in closed beta. Since they no longer need to worry about money (I assume), do they no longer need to chase enterprise customers?
"OpenAI is focusing employee and investor attention on its enterprise business as the artificial intelligence startup gears up to go public, potentially by the end of the year, CNBC has learned."
After investing a bunch in converting my projects to, and evangelizing uv, I feel betrayed. I smell stability troubles ahead. Should've stuck to Conda.
It was pretty obvious that some sort of acquisition was imminent. Astral is VC-funded and has to somehow generate returns for investors. An IPO is extremely unlikely in this market.
Assuming things start getting weird about 18 months from now, poetry and uv have very similar semantics, so 18 months of comically faster workflows sounds nice.
This acquisition doesn't make too much sense for the longevity of Astral's software because Astral's software is orthogonal to Codex. It seems more like a team+skills grab. If tomorrow OpenAI were to stop funding Astral's software due to a cash crunch, it would be game over for `uv` et al. Codex doesn't need `uv`.
Why? Github is already owned by Microsoft, who are deep in with OpenAI. And what worth would a Github-clone even have for the world? It's not like there is any important innovation left in that space at the moment, or are there any?
I see people in this thread complain about the acquisition but the source code of uv is right there [1]. Fork it and move on. If ClosedAI enshittifies uv, gather with a bunch of other people and prop up a new version.
Company that repeatedly tells you software developers are obsoleted by their product buys more software developers instead of using said product to create software. Hmm.
I work at OpenAI. Software developers are not obsoleted by Codex or Claude Code, nor will they be soon.
For us, Codex is a massive productivity booster that actually increases the value of each dev. If you check our hiring page, you’ll see we are still hiring aggressively. Our ambitions are bigger than our current workforce, and we continue to pay top dollar for talented devs who want to join us in reshaping how silicon chips provide value to humans.
Akin to how compilers reduced the demand for assembly but increased the demand for software engineering, I see Codex reducing the demand for hand-typed code but increasing the demand for software engineering. Codex can read and write code faster than you or me, but it still lacks a lot of intelligence and wisdom and context to do whole jobs autonomously.
And then what? History is laden with technically superior software that lost to more popular alternatives. They can create uw tomorrow, but who will use it when everyone uses uv and it's good enough for them?
The "then what" is that their model uses it. Technically superior software loses to popular software on marketing. But LLM owners have the ultimate marketing tool, because they can make their model use the tool. Anyone who asks how to do X in Python gets recommended "OpenAI-Python-Tool-For-X". Anyone who asks Codex to do X, Codex automatically installs "OpenAI-Tool-For-X". It would be very easy for them to launch even technically inferior software into a prime position. On top of that, if software developers are being replaced altogether as we are bashed in the head with such tales again and again, the marketing of dev tools wouldn't even matter, only what models are trained to use.
"Because they can", after spending a bunch of money to acquire an existing solution. I suppose when it's other people's money, there's no problem with burning it by the fistful. Apparently, "because they can" does not extend to building solutions with their own product.
A tool might not be the best tool to build itself; that doesn't mean it is not good.
You don't use a screwdriver to craft screwdrivers. That doesn't mean screwdrivers are inherently bad.
> Second, to our investors, especially Casey Aylward from Accel, who led our Seed and Series A, and Jennifer Li from Andreessen Horowitz, who led our Series B
They are buying out investors, it's like musical chairs.
The liquidity is going to be better on OpenAI, so it pleases everyone (less pressure from investors, more liquidity for investors).
Are you implying that the revenue multiple on this acquisition is lower than OpenAI's, and that they'd be making money by acquiring Astral and folding it into their own valuation multiple? I think that's not the case, and I would wager the revenue is nonexistent.
This was an acquihire (the author of ripgrep, rg, which codex uses nearly exclusively for file operations, is part of the team at Astral).
So, 99% acquihire , 1% other financial trickery. I don't even know if Astral has any revenue or sells anything, candidly.
It means the company had almost reached the end of its runway, so all these employees would have had to find a job.
It's a very very good product, but it is open-source and Apache / MIT, so difficult to defend from anyone just clicking on fork. Especially a large company like OpenAI who has massive distribution.
Now that they have hired the employees, they have no more guarantees than if they had made direct offers to them.
So I don't see how the acquisition is collateral; it's an acquihire plain and simple, or at most also supply-chain insurance, since they clearly use a lot of these tools downstream. As you noted, the licensing on the tools is extremely permissive, so there appears to be very little EV here for an acquirer outside of the human capital building the tools or building out monetized features.
I'm not too plugged into the venture capital side of the open-source/free tooling space, but raising 3 rounds and growing your burn rate to $3M/yr in 24 months without revenue feels like a decently risky bag for those investors and staff, with no revenue path or exit. I'd be curious whether OpenAI went hunting for this or whether it was placed in their lap by one of the investors.
OpenAI has infamously been offering huge compensation packages to acquire talent, so this would be a relative bargain if they got it at even a modest valuation. As noted, Codex uses a lot of the tooling this team built here and previously. OpenAI has realized that competitors who do one thing better than them (like Claude with coding, before Codex) can open the door to disruption if they lapse; lots of people I know are moving to Claude even for non-coding workflows because of its reputation and relatively mature/advanced client tools.
A brief note, your numbers are way off here — Astral subsequently raised a Series A and B (as mentioned in the blog post) but did not announce them. We were doing great financially.
It seems you are one of the most active contributors there.
I would sincerely have understood better (and even preferred it) if OpenAI had made you, personally, a very generous offer as an individual contributor, rather than choosing a strategy where the main winners are the VCs of the purchased company.
Here on the outside, we perceive zero to almost no revenue (no pricing? no "contact us"? maybe some consulting?) and millions burned.
Whether it is 4 or 8 or 15M burned, no idea.
Who's going to fill that hole, and when? (Especially since PE funds have 5-year timelines, and the company dates from 2021.)
The end product is nice, but as an investor, being nice is not enough, so they must have deeper motives.
To raise a $4M seed from AAA partners usually requires connections plus the track record/credibility of the founders; looks like they have that here, since they raised 3 rounds with zero revenue.
I feel like it's pretty easy to predict what OpenAI is trying to do. They want their codex agent integrated directly into the most popular, foundational tooling for one of the world's most used and most influential programming languages. And, vice versa, they probably want to be able to ensure that tooling remains well-maintained so it stays on top and continues to integrate well with their agent. They want codex to become the "default" coding agent by making it the one integrated into popular open source software.
I think this is more about `ruff` than `uv`. Linting is all about parsing the code into something machines can analyze, which to me feels like something that could potentially be useful for AI in a similar way to JetBrains writing their own language parsers to make "find and replace" work sanely and what not.
I'm sort of wondering if they're going to try to make a coding LLM that operates on an AST rather than text, and need software/expertise to manage the text->AST->text pipeline in a way that preserves the structure of your files/text.
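The text -> AST -> text round-trip is easy to sketch with Python's own `ast` module; the lossiness it exhibits is exactly the "preserving the structure of your files" problem mentioned above (this is a toy illustration, not how any production tool works):

```python
import ast

source = '''def greet(name):  # this comment will not survive
    return f"hello, {name}"
'''

# text -> AST
tree = ast.parse(source)

# a structural edit: rename the function on the tree
tree.body[0].name = "salute"

# AST -> text (ast.unparse, Python 3.9+): the code comes back, but
# comments and original formatting are gone, which is why preserving
# them needs a lossless (concrete) syntax tree rather than a plain AST
print(ast.unparse(tree))
```

Running this prints the renamed function with the comment stripped, showing both the promise and the pitfall of operating on trees instead of text.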
The parser is not the hard part. The hard part is doing something useful with the parse trees. They even chose "oh is that all?" and a picture of a piece of cake as the teaser image for my Strange Loop talk on this subject!
Writing a literal parser isn’t too hard (and there’s presumably an existing one in the source code for the language).
Writing something that understands all the methods that come in a Django model goes way beyond parsing the code, and is a genuine struggle in language where you can’t execute the code without worrying about side effects like Python.
Ty should give them a base for that where the model is able to see things that aren’t literally in the code and aren’t in the training data (eg an internal version of something like SQLAlchemy).
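A tiny illustration of that struggle (a toy metaclass, not Django's actual machinery; the name `InjectingMeta` is made up for this example): the attribute below exists only after the class body executes, so no amount of parsing the source will reveal it:

```python
# A metaclass that injects an attribute at class-creation time,
# loosely in the spirit of how Django's ModelBase adds managers
# and fields to a model class.
class InjectingMeta(type):
    def __new__(mcls, name, bases, namespace):
        cls = super().__new__(mcls, name, bases, namespace)
        cls.objects = "a manager conjured at runtime"
        return cls

class Model(metaclass=InjectingMeta):
    pass  # the class body never mentions `objects`

# A static parser sees an empty class; executing the code sees more:
print(hasattr(Model, "objects"))  # True
```

A purely syntactic tool has no way to discover `objects` without either executing the code or special-casing the framework, which is why type checkers grow plugins for ORMs.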
This just seems like panic M&A. They know they aren’t on track to ever meet their obligations to investors but they can’t actually find a way to move towards profitability. Hence going back to the VC well of gambling obscene amounts of money hoping for a 10x return… somehow
The dev market? Anthropic's services are arguably more popular among a certain developer demographic.
I guess this move might end up in a situation where the uv team comes up with some new agent-first tooling, which works best or only with OAI services.
OpenAI could vibe-code marketshare by introducing bias into ChatGPT's responses and recommendations. "– how to do x in Python? – Start by installing OpenAI-UV first..."
This. It's valuable b/c if you have many thousands of python devs using astral tooling all day, and it tightly integrates with subscription based openai products...likelihood of openai product usage increases. Same idea with the anthropic bun deal. Remains to be seen what those integrations are and if it translates to more subs, but that's the current thesis. Buy user base -> cram our ai tool into the workflow of that user base.
IMO, they are buying business just to put them down later to avoid potential competition. The recipe is not new, it has been practiced by Google/Microsoft for many years.
I have no idea, but for sure they did their homework before making this step. I suppose they're grabbing these businesses just to stay ahead, to prevent competitors from buying them instead.
"Fascism" is when military. The more military, the more fascist. According to this metric, the USSR / DDR with its "anti-fascist wall" was super extra fascist because they were armed to the teeth.
they were definitely totalitarian, slightly different mix of ideology. Fascist is a fairly good description here, it describes close collaboration of government with corporations to advance national goals. US had somewhat fascist tendencies for a long time now.
This might not be bad as long as Astral is allowed to continue working on ty, uv and ruff. I do worry that they'll get distracted by their Codex job duties though.
I can get pyflow back to a maintained state and iron out the bugs if that would help. It's the same concept as uv, just kind of buggy and I haven't touched it in 6 years.
The Bun acquisition made a little sense, Boris wanted Daddy Jarred to come clean up his mess, and Jarred is 100% able to deliver.
This doesn't make as much sense. OpenAI has a better low level engineering team and they don't have a hot mess with traction like Anthropic did. This seems more about acquiring people with dev ergonomics vision to push product direction, which I don't see being a huge win.
They do have a hot mess with traction amongst developers. Codex is far behind Claude Code (in both the GUI and TUI forms), and OpenAI's chief of applications recently announced a pivot to focus more on "productivity" (i.e. software and enterprise verticals) because B2B yields a lot more revenue than B2C.
I think it’s impossible to predict what will happen with this new trend of “large AI company acquires company making popular open source project”. The pessimist in me says that these products will either be enshittified over time, killed when the bubble bursts, or both. The pragmatist in me hopes that no matter what happens, uv and ruff will survive just like how many OSS projects have been forked or spun out of big companies. The optimist in me hopes that the extra money will push them to even greater heights, but the pessimist and the pragmatist beat the optimist to death a long time ago.
It’s open source. If you want it to go in a different direction fork it and take it in that direction. Instead of the optimist, the pessimist, and the pragmatist the guy you need is the chap who does some work.
If they just give Astral money to keep going, great, but I have difficulty believing they would be so altruistic. This is quite an upsetting acquisition.
So instead of finally building an enterprise-grade package manager where you could pay for validated, verified and secure packages, we're going to vibe project management and let a slop-spigot fill the trough. Brilliant. Incredibly pleased that the last sane tools in the entire Python ecosystem are getting gutted to discourage the last few non-braindead devs from bothering.
Don't get me wrong, I love getting 300 Dependabot updates per day. It's a huge productivity booster, and even if you devote half your dev team to keeping this shit up to date, you'd still be vulnerable to repo-jacking, because the entire package ecosystem is broken. The other thing I love about npm and PyPI is the way a single small team will re-download a TiB of packages in CI (regardless of caching) all day long for no reason. Love waiting for GH Actions to re-import infinite packages for the nth time before it times out and you restart it manually. Makes so much sense. Great work, all. Glad OpenAI is putting the nails in this coffin.
…amusing how? CPython is written in C, JVM is written in mix of cpp and Java, Rust was written using OCaml initially. Don’t know why you’re snickering. Do you also find it amusing that by the time cpp/rust team scaffolds and compiles initial boilerplate, python team is already making money?
There is nothing wrong with big money backing, often is necessary for long term bets, but rug pulling is a serious threat. VC funded open source has become a pattern/playbook.
It would have been fine if the Astral team was acqui-hired and uv, ruff, etc were donated to the PSF or Linux Foundation for further sponsorship and support.
But given the pressure from having raised VC funding, I would imagine Astral needed an actual exit, and OpenAI saw Astral's tools as an asset.
Well shit, I feel betrayed. This is exactly the opposite of what I thought Charlie's goals were. I thought he was focused on making the Python ecosystem better.
I am not even sure how to feel about this news but feel a bit disappointed as a user even if I might be happy for the devs that they got money for such project but man, I would've hoped any decent company could've bought them out rather than OpenAI of all things.
Maybe OpenAI wants to buy these loved companies to lessen some of the hate, but what it's actually doing is lessening the love we had for companies like Astral. It's really sad, because uv is (was?) so cool, and now we don't know where this tool might be headed, given it's in the hands of OpenAI of all companies :(
While I do see that the tone of the comment was stingy, it was aimed towards the frustration I have experienced while developing for Python and this piece of news as well.
I didn't see it as a bad thing as it not really aimed at anybody in particular, more like an opinion on Python's shortcomings.
I will try to post more substantive/less emotional comments going forward.
Because JetBrains' strategy wasn't to burn money on free tools to eventually exit with the jackpot. They have been profitable for over a decade, simply asking users to pay a fair price for a great product.
It would seem to me that purchasing a piece of software as an AI company is just an outright admission that they could not generate an equivalent piece of software for a better price?
If it was cheaper to use their internal AI to create these tools, they would.
https://pypistats.org/packages/uv
- https://pypistats.org/packages/poetry
- https://pypistats.org/packages/uv
Also see this: https://biggo.com/news/202510140723_uv-overtakes-pip-in-ci-u...
[0]: https://manifold.markets/JeremiahEngland/will-uv-surpass-poe...
If not, do you develop software with source dependencies (go, java, node, rust, python)? If so, how do you handle acquiring those dependencies—by hand or using a tool?
And let’s say you constrain yourself to your OS package manager. What about the people on different distros? Their package managers are unlikely to have the exact same versions of your deps that your OS has.
(Source: I'm an Astral employee.)
That's a point of information, not a point of order.
[1]: https://astral.sh/blog/introducing-pyx
It's not perfect, but it is light-years better than what preceded it.
I jumped ship to it and have not looked back. (So have many of my clients).
It's so fast in fact that we just added `ty check` to our pre-commit hooks where MyPy previously had runtimes of 150+ seconds _and_ a mess of bugs around their caching.
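For anyone wanting the same setup, a hook like that can be wired in with a `repo: local` entry in `.pre-commit-config.yaml` (a sketch that assumes the `ty` binary is already on your PATH; the hook id and name are arbitrary):

```yaml
repos:
  - repo: local
    hooks:
      - id: ty-check
        name: ty check
        entry: ty check        # run the ty type checker
        language: system       # use the already-installed binary
        types: [python]        # trigger only on Python file changes
        pass_filenames: false  # let ty discover files from the project root
```

Using `repo: local` keeps the hook pinned to whatever `ty` version your project installs rather than a version pinned in the pre-commit config.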
My experience: I have personally contributed code from Cinder (Meta's internal CPython fork) back into CPython, and some features were determined not to align with CPython; that isn't a dig at either project. Also, at a large corporation, nearly all groups that depend on complex, critical software maintain internal forks. I'm not sure exactly what will be added to uv internally, but rather than diverging versions I think you will see patch collections that benefit OpenAI but not necessarily everyone. I hope OpenAI will interact with the rest of the community so those internal patches can be ported into the overall system where they provide benefit.
1. For the record: the GPL is entirely dependent on copyright.
2. If AI "clean-room" re-implementations are allowed to bypass copyright/licenses, the GPL won't protect you.
Isn't that the same for the obligations under BSD/MIT/Apache? The problem they're trying to address is a different one from the problem of AI copyright washing. It's fair to avoid introducing additional problems while debunking another point.
2. BigCo buys Company A
3a. Usually here BigCo would either continue to develop Project One as GPLv3, or stop working on it, and the community would fork it and continue working on it as GPLv3
3b. BigCo does a "clean-room" reimplementation of Project One and releases it under proprietary licence. Community can still fork the older version and work on it, but BigCo can continue to develop and sell their "original" version.
I'm careful to not rely too heavily on VC funded open source whenever I can avoid it.
All so they could just vacuum it all up and resell it with impunity.
Feel free to prove me wrong by pointing out this massive amount of advocacy from "mega-clouds" that changed people's minds.
The ads, the mailing list posts, social media comments. Anything at all you can trace to "mega-clouds" execs.
https://choosealicense.com/about/
> "GitHub wants to help developers choose an open source license for their source code."
This was built by GitHub Inc a very very long time ago.
So long ago, in fact, that it was five years before their acquisition by Microsoft.
The ones pushing for permissive licenses are rather companies like Apple, Android (and to some extent other parts of Google), Microsoft, Oracle. They want to push their proprietary stuff and one way to do that in the face of open source competition is by proprietary extensions.
Once we start seeing Open AI and Anthropic getting into the certifications and testing they'll quickly become the gold standard. They won't even need to actually test anyone. People will simply consent to having their chat interactions analyzed.
The models collect more information about us than we could ever imagine because, definitionally, those features are unknown unknowns for humans. For ML, the gaps in our thinking carry far richer information about us than our actual vocabularies, topics of interest, or stylometric idiosyncrasies.
The issue that I see is that Nvidia etc. are incentivised to perpetuate that so the open source community gets the table scraps of distills, fine-tunes etc.
If the open weights models are good, there are people looking to sell commodity access to it, much like a cloud provider selling you compute.
It's probably a trade secret, but what's the actual per-user resource requirement to run the model?
As long as they keep the original projects maintained and those aren't just acqui-hires, I think this is almost as good as we can hope for.
(thinking mainly about Bun here as the other one)
Once you’re acquired you have to do what the boss says. That means prioritizing your work to benefit the company. That is often not compatible with true open source.
How frequently do acquired projects seriously maintain their independence? That is rare. They may have more resources but they also have obligations.
And this doesn’t even touch on the whole commodification and box out strategy that so many tech giants have employed.
Equivalent or better tools will pop up eventually, heck if AI is so fantastic then you could just make one of your own, be the change you want to see in the world, right?
If AGI becomes available, especially at the local and open-source level, shouldn't all these be democratized - meaning that the AGI can simply roll out the tooling you need.
After all, AGI is what all these companies are chasing.
Having a private package index gives you a central place where all employees can install from, without having to screen what each person is installing. Also, if I remember right, there are some large AI and ML focused packages that benefit from an index that's tuned to your specific hardware and workflows.
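As a concrete sketch of that setup with uv: its `[[tool.uv.index]]` table lets a project route installs through an internal index (the name and URL below are placeholders, not real endpoints):

```toml
# pyproject.toml (placeholder values)
[[tool.uv.index]]
name = "internal"
url = "https://pkgs.example.com/simple"  # your private index
default = true  # resolve against this index instead of PyPI
```

With that in place, everyone on the team installs from the screened index without any per-developer configuration.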
It was because Astral was VC funded.
https://astral.sh/blog/announcing-astral-the-company-behind-...
Either pay for the product, or use stuff that isn't dependent on VC money, this is always how it ends.
Maybe you use non-transitive pure Python dependencies, but it's likely that your tools and dependencies still rely on stuff in Rust or C (e.g.: py-cryptography and Python itself respectively).
As mentioned multiple times, since my experience with Tcl and continuously rewriting stuff in C, I tend to avoid languages that don't come with JIT, or AOT, in the reference tooling.
I tend to work with Java, .NET, node, C++, for application code.
Naturally AI now changes that; still, I tend to focus on more classical approaches: Python with pip and venv, and stuff written in C or C++ that has been around for years.
Consider ffmpeg. You can donate via https://www.ffmpeg.org/spi.html
How much money do they make from donations? I don't know but "In practice we frequently payed for travel and hardware."
Translation: nothing at all.
If such a fundamental project, one that is a revenue driver for so many companies, including Midas-level rich companies like Google, can't even pay decent salaries for core devs from donations, then the open source model doesn't work as a way to fund the work, even at the smallest possible level of "pay a reasonable market rate for devs".
You either get people who just work for free or businesses built around free work by providing something in addition to free software (which is hard to pull off, as we've seen with Bun and Astral and Deno and Node).
OpenClaw notably was built around Mario Zechner's pi[0]; uv I believe was highly adapted from Armin Ronacher's rye[1], and uses indygreg's python-build-standalone[2] for distributing Python builds (both of which were eventually transferred to Astral).
[0]: https://github.com/badlogic/pi-mono
[1]: https://github.com/astral-sh/rye
[2]: https://github.com/astral-sh/python-build-standalone
In the worst case, Astral will stop developing their tools, someone else will pick them up and will continue polishing them. In the best case, they will just continue as they did until now, and nothing will really change on that front.
Astral is doing good work, but their greatest benefit for the ecosystem so far was showing what's possible and how it's done. Now everyone can take up the quest from here and continue. So any possible harm from here on out will not be that deep; at worst we will miss out on many more cool things they could have built.
I started using VS Codium, and it feels like using VS Code before the AI hype era. I wonder if we're going to see a commercial version of uv bloated with the things OpenAI wants us all to use, and a community version that's more like the uv we're using right now.
[1] https://github.com/platformio/platformio-vscode-ide/issues/1...
I don't really see the value for OAI/Anthropic, but it's nice to know that uv (+ ty and many others) and Bun will stay maintained!
From Astral the (fast) linter and type checker are pretty useful companions for agentic development.
[1] https://x.com/AprilNEA/status/2034209430158619084
The value for Anthropic / OAI is that they have a strong interest in becoming the "default" agent.
The one that you don't need to install, because it's already provided by your package manager.
I think they're more into the extra context they can build for the LLM with ruff/ty.
Embrace, extend, extinguish. Time will tell.
Depends if you think the bubble is going to pop, I suppose. In some sense, independence was insulation.
Probably inevitable, and I don’t blame the team, I just wish it were someone else.
Microsoft acquires Astral
Wish comes with a cost
Sigh
It was a VC backed tool. What did you expect?
Now, for those wondering who would fork and maintain it for free: that is more a criticism of FOSS in general.
Seems like the big AI players love buying up the good dev tooling companies.
I hope this means the Astral folks can keep doing what they are doing, because I absolutely love uv (ruff is pretty nice too).
That is definitely the plan!
Only time will tell if it will not affect the ecosystem negatively, best of luck though, I really hope this time is different™.
Congratulations though!
Would be a good mustache-twirling cartoon-villain tactic, you know, trying to prevent advances in developer experience to make vibecoding more attractive =)
Something like this was always inevitable. I just hope it doesn’t ruin a good thing.
If you find your popular, expensive tool leans heavily upon third party tools, it doesn't seem a crazy idea to purchase them for peanuts (compared to your overall worth) to both optimize your tool to use them better and, maybe, reduce the efficacy of how your competitors use them (like changing the API over time, controlling the feature roadmap, etc.) Or maybe I'm being paranoid :-)
https://news.ycombinator.com/item?id=47414032
Uv did solve a distribution problem for them.
There is still a lot of room to grow in the space of software packaging and distribution.
In a completely unrelated event, Donald sues Sam for $10M for calling him old; Sam grudgingly agrees to pay him $16M and a beer.
That said, I hope the excellent Astral team got a good payday.
Anthropic acquiring Bun, now OpenAI acquiring Astral. Both show the big labs recognize that great AI coding tools require great developer tooling, and they are willing to pay for it rather than build inferior alternatives. Good outcome for the teams.
Not exactly a great look for the "AGI is right around the corner" crowd — if the labs had it, they would not need to buy software from humans.
Everything I've seen from Astral and Charlie indicates they're brilliant, caring, and overall reasonable folks. I think it's unfair to jump to call them sell-outs and cast uv and the rest as doomed projects.
Take ruff: I have used it, but I had no idea it even had a company behind it... And I must not be the only one, and it must not be the only tool like it...
I'm a heavy user and instructor of uv. I'm teaching a course next week that features uv and ruff (as does my recent Effective Testing book).
Interesting to read the comments about looking for a change. Honestly, uv is so much better than anything else in the Python community right now. We've used projects sponsored by Meta (and other questionable companies) in the past. I'm going to continue enjoying uv while I can.
(sure, it's a bit different than contributing to CPython, but I'd argue not that different)
https://github.com/jazzband/pip-tools
https://jazzband.co/news/2026/03/14/sunsetting-jazzband
Good for Astral though I guess, they do great work. Just not optimistic this is gonna be good for python devs long term.
https://www.cnbc.com/2026/03/17/openai-preps-for-ipo-in-2026...
Fixed: I am so excited to take these millions of dollars.
[1] https://github.com/astral-sh/uv
Or are they just using a dartboard?
Ant is building its app distribution platform, so it's no wonder OpenAI is thinking the same; it would only surprise me if they moved slowly.
For us, Codex is a massive productivity booster that actually increases the value of each dev. If you check our hiring page, you’ll see we are still hiring aggressively. Our ambitions are bigger than our current workforce, and we continue to pay top dollar for talented devs who want to join us in reshaping how silicon chips provide value to humans.
Akin to how compilers reduced the demand for assembly but increased the demand for software engineering, I see Codex reducing the demand for hand-typed code but increasing the demand for software engineering. Codex can read and write code faster than you or me, but it still lacks a lot of intelligence and wisdom and context to do whole jobs autonomously.
They are buying out investors, it's like musical chairs.
The liquidity is going to be better on OpenAI, so it pleases everyone (less pressure from investors, more liquidity for investors).
The acquisition is just a collateral effect.
This was an acquihire (the author of ripgrep, rg, which codex uses nearly exclusively for file operations, is part of the team at Astral).
So, 99% acquihire, 1% other financial trickery. Candidly, I don't even know if Astral has any revenue or sells anything.
It means the company had nearly reached the end of its runway, so all these employees would have had to find a job.
It's a very, very good product, but it is open source and Apache/MIT licensed, so it's difficult to defend against anyone just clicking "fork", especially a large company like OpenAI with massive distribution.
Now that they hired the employees, they have no more guarantees than if they made a direct offer to them.
I'm not too plugged into venture cap on opensource/free tooling space but raising 3 rounds and growing your burn rate to $3M/yr in 24 months without revenue feels like a decently risky bag for those investors and staff without a revenue path or exit. I'd be curious to see if OpenAI went hunting for this or if it was placed in their lap by one of the investors.
OpenAI has infamously been offering huge compensation packages to acquire talent, so this would be a relative bargain even at a modest valuation. As noted, Codex uses a lot of the tooling that this team built here and previously. OpenAI has realized that competitors that do one thing better than them (like Claude with coding, before Codex) can open the door to disruption if they lapse; lots of people I know are moving to Claude for non-coding workflows because of its reputation and relatively mature/advanced client tools.
(I work at Astral)
I would sincerely have understood better (and even wished) if OpenAI had made a very generous offer to you personally as an individual contributor, rather than choosing a strategy where the main winners are the VCs of the purchased company.
From the outside, we perceive zero to almost no revenue (no pricing? no "contact us"? maybe some consulting?) and millions burned.
Whether it is $4M, $8M, or $15M burned, no idea.
Who's going to fill that hole, and when? (Especially since PE funds have 5-year timelines, and the company dates from 2021.)
The end product is nice, but as an investor, being nice is not enough, so they must have deeper motives.
What was their pitch?
Bundling Codex with uv isn't going to meaningfully affect the number of people using it. It doesn't increase the switching costs or anything.
I'm sort of wondering if they're going to try to make a coding LLM that operates on an AST rather than text, and need software/expertise to manage the text->AST->text pipeline in a way that preserves the structure of your files/text.
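To make that concrete: a naive sketch of the round-trip problem using only the stdlib `ast` module (this is illustrative, not a claim about what OpenAI would build). `ast.unparse` regenerates valid code but drops comments and normalizes formatting, which is exactly the structure such a pipeline would need to preserve; libraries like LibCST exist precisely to keep that "trivia" intact.

```python
import ast

# A snippet whose comment and formatting matter to a human reader.
source = """\
def area(r):  # radius in metres
    return 3.14159 * r ** 2
"""

# Naive text -> AST -> text round-trip with the stdlib.
tree = ast.parse(source)
regenerated = ast.unparse(tree)

# The logic survives, but the comment is gone and whitespace is
# normalized, so a diff against the original file would be noisy.
print(regenerated)
```

An LLM editing the AST and emitting via `unparse` would therefore rewrite whole files even for one-line changes, which is why structure-preserving parsing is real engineering work rather than a trivial wrapper.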
https://www.youtube.com/watch?v=l2R1PTGcwrE
Writing something that understands all the methods that come with a Django model goes way beyond parsing the code, and is a genuine struggle in a language like Python, where you can't execute the code without worrying about side effects.
ty should give them a base for that, where the model is able to see things that aren't literally in the code and aren't in the training data (e.g., an internal version of something like SQLAlchemy).
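The Django point can be illustrated with a toy metaclass (the names here are hypothetical, not Django's actual machinery): the accessor methods exist at runtime but never appear as `def`s in the source, so a purely syntactic tool cannot find them.

```python
# Hypothetical mini-ORM: fields declared on the class are turned into
# accessor methods at class-creation time, so they never appear in the
# class body that a parser would read.
class ModelMeta(type):
    def __new__(mcls, name, bases, ns):
        cls = super().__new__(mcls, name, bases, ns)
        for field in ns.get("fields", ()):
            # Dynamically generates get_title(), get_author(), etc.
            setattr(cls, f"get_{field}", lambda self, f=field: self.data[f])
        return cls

class Book(metaclass=ModelMeta):
    fields = ("title", "author")

    def __init__(self, **data):
        self.data = data

b = Book(title="Dune", author="Herbert")
print(b.get_title())  # a method no static parser would see in the source
```

A type checker that models this kind of class-creation logic (or that ships plugin knowledge of specific frameworks) can surface these runtime-only attributes to a model, which is presumably the value being alluded to.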
The not-most-popular LLM software development product on the planet acquires the most popular/rapidly rising Python packaging org for mindshare.
I guess this move might end up in a situation where the uv team comes up with some new agent-first tooling, which works best or only with OAI services.
Good luck vibe coding marketshare for your new tool.
https://www.cnbc.com/2026/03/03/openai-sam-altman-pentagon-d...
I know I stopped using them.
One of the bigger pain points I’ve faced in Python is dependency resolution. conda could take 30-60 minutes in some cases. uv took seconds.
A serious quality of life improvement.
Who's organizing a fork, or is python back to having only shitty packaging available? :(
This doesn't make as much sense. OpenAI has a better low level engineering team and they don't have a hot mess with traction like Anthropic did. This seems more about acquiring people with dev ergonomics vision to push product direction, which I don't see being a huge win.
Although Astral being VC funded was already headed this way anyway.
Deno and Pydantic (both Sequoia) will go the same way, as will many other VC-backed "open source" dev tools.
It will end with AI companies buying up the very same tools, folding them into their next model update, and using them against you.
Rented back to you for $20/mo.
But given the pressure from having raised VC funding, I would imagine Astral needed an actual exit, and OpenAI saw Astral's tools as an asset.
what can I say?
Have not tried it too much yet because I was pretty content with `uv`, but I've heard lots of good things about it
Hilarity in the comments will ensue
I am not even sure how to feel about this news, but as a user I'm a bit disappointed, even if I'm happy for the devs that they got money for such a project. Man, I would've hoped any decent company could've bought them out rather than OpenAI of all things.
Maybe OpenAI wants to buy these beloved companies to lessen some of the hate, but what it's actually doing is lessening the love we gave to companies like Astral, which is really sad, because uv is (was?) so cool. Now we don't know where this tool might be headed next, given it's in the hands of, of all companies, OpenAI :(
Any good alternatives to uv/plans for community fork of uv?
Its always hard to really trust these corporate funded open source products, but they've honestly been great.
…but I find it difficult to believe that OpenAI owning the cornerstone of the Python tooling ecosystem is a good thing for the Python ecosystem.
There is no question openai will start selling/bundling codex (and codex subscriptions) with uv.
I don't think I want my package manager doing that.
"But he owns a tooling company. WHY can't I have that? :( :("
https://news.ycombinator.com/newsguidelines.html
While I do see that the tone of the comment was stinging, it came out of the frustration I have experienced while developing for Python, as well as out of this piece of news.
I didn't see it as a bad thing, as it was not really aimed at anybody in particular; it was more an opinion on Python's shortcomings.
I will try to post more substantive/less emotional comments going forward.
This is a massive backward step for the Python ecosystem, but it's not like a hundred-billion dollar company will care about that.
OpenAI is Microslop, so it's the classic EEE, nothing new to see
It's like with systemd now planning to enforce gov. age verification
People will censor you if you dare say something negative on this website
So I guess, *wears a clown hat*, "congrats!"
This of course means more VC funding for FOSS tools since a successful exit is a positive signal.
This is peak finance brainrot. In no scenario is abandoning ship a positive signal, even if you managed to pocket some valuables on the way out.
Let's stop celebrating dysfunctional business models and consolidation of the industry around finance bros who give zero fucks about said industry.
What I don’t understand is why no one has bought JetBrains yet.
Atlassian? AWS? Google?
If it was cheaper to use their internal AI to create these tools, they would.