Launch HN: Onyx (YC W24) – Open-source chat UI

Hey HN, Chris and Yuhong here from Onyx (https://github.com/onyx-dot-app/onyx). We’re building an open-source chat that works with any LLM (proprietary + open weight) and gives these LLMs the tools they need to be useful (RAG, web search, MCP, deep research, memory, etc.).

Demo: https://youtu.be/2g4BxTZ9ztg

Two years ago, Yuhong and I had the same recurring problem. We were on growing teams and it was ridiculously difficult to find the right information across our docs, Slack, meeting notes, etc. Existing solutions required sending out our company's data, lacked customization, and frankly didn't work well. So, we started Danswer, an open-source enterprise search project built to be self-hosted and easily customized.

As the project grew, we started seeing an interesting trend—even though we were explicitly a search app, people wanted to use Danswer just to chat with LLMs. We’d hear, “the connectors, indexing, and search are great, but I’m going to start by connecting GPT-4o, Claude Sonnet 4, and Qwen to provide my team with a secure way to use them”.

Many users would add RAG, agents, and custom tools later, but much of the usage stayed ‘basic chat’. We thought: “why would people co-opt an enterprise search tool when other AI chat solutions exist?”

As we continued talking to users, we realized two key points:

(1) just giving a company secure access to an LLM with a great UI and simple tools is a huge part of the value add of AI

(2) providing this well is much harder than you might think and the bar is incredibly high

Consumer products like ChatGPT and Claude already provide a great experience—and chat with AI for work is something (ideally) everyone at the company uses 10+ times per day. People expect the same snappy, simple, and intuitive UX with a full feature set. Getting hundreds of small details right to take the experience from “this works” to “this feels magical” is not easy, and nothing else in the space has managed to do it.

So ~3 months ago we pivoted to Onyx, the open-source chat UI with:

- (truly) world-class chat UX. Usable both by a fresh college grad who grew up with AI and by an industry veteran who’s using AI tools for the first time.

- Support for all the common add-ons: RAG, connectors, web search, custom tools, MCP, assistants, deep research.

- RBAC, SSO, permission syncing, easy on-prem hosting to make it work for larger enterprises.

Through building features like deep research and code interpreter that work across model providers, we've learned a ton of non-obvious things about engineering with LLMs that have been key to making Onyx work. I'd like to share two that were particularly interesting (happy to discuss more in the comments).

First, context management is one of the most difficult and important things to get right. We’ve found that LLMs really struggle to remember both system prompts and previous user messages in long conversations. Even simple instructions like “ignore sources of type X” in the system prompt are very often ignored. This is exacerbated by multiple tool calls, which can often feed in huge amounts of context. We solved this problem with a “Reminder” prompt—a short 1-3 sentence blurb injected at the end of the user message that describes the non-negotiables that the LLM must abide by. Empirically, LLMs attend most to the very end of the context window, so this placement gives the highest likelihood of adherence.
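As a minimal sketch of the idea (the helper and reminder text below are illustrative, not Onyx's actual code), assuming OpenAI-style message dicts:

```python
# Illustrative only: append a short "reminder" of the non-negotiables to the
# final user message, since models attend most to the end of the context window.

REMINDER = (
    "Reminder: ignore sources of type X and cite every document you use."
)

def apply_reminder(messages: list[dict]) -> list[dict]:
    """messages: OpenAI-style [{"role": ..., "content": ...}, ...]."""
    patched = [dict(m) for m in messages]  # don't mutate the caller's list
    for msg in reversed(patched):
        if msg["role"] == "user":
            msg["content"] = f"{msg['content']}\n\n{REMINDER}"
            break
    return patched
```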

Second, we’ve needed to build an understanding of the “natural tendencies” of certain models when using tools, and build around them. For example, the GPT family of models is fine-tuned to use a Python code interpreter that operates in a Jupyter notebook. Even when told explicitly, these models refuse to add `print()` around the last line, since, in Jupyter, the value of that last line is automatically echoed to the output. Other models don’t have this strong preference, so we’ve had to design our model-agnostic code interpreter to automatically `print()` the last bare line.
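A rough sketch of that kind of rewrite (a hypothetical helper, not the actual Onyx implementation) using Python's `ast` module:

```python
import ast

def print_last_bare_line(code: str) -> str:
    """If the final statement is a bare expression (Jupyter style), wrap it in
    print() so its value reaches stdout in a plain interpreter."""
    try:
        tree = ast.parse(code)
    except SyntaxError:
        return code  # leave unparseable code untouched
    if not tree.body:
        return code

    last = tree.body[-1]
    already_prints = (
        isinstance(last, ast.Expr)
        and isinstance(last.value, ast.Call)
        and isinstance(last.value.func, ast.Name)
        and last.value.func.id == "print"
    )
    if not isinstance(last, ast.Expr) or already_prints:
        return code  # only rewrite bare, non-print expressions

    lines = code.splitlines()
    start, end = last.lineno - 1, last.end_lineno - 1  # ast positions are 1-based
    lines[start:end + 1] = ["print(" + "\n".join(lines[start:end + 1]) + ")"]
    return "\n".join(lines)

print(print_last_bare_line("total = 1 + 2\ntotal"))
# -> total = 1 + 2
#    print(total)
```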

So far, we’ve had a Fortune 100 team fork Onyx and provide 10k+ employees access to every model within a single interface, and create thousands of use-case specific Assistants for every department, each using the best model for the job. We’ve seen teams operating in sensitive industries completely airgap Onyx w/ locally hosted LLMs to provide a copilot that wouldn’t have been possible otherwise.

If you’d like to try Onyx out, follow https://docs.onyx.app/deployment/getting_started/quickstart to get set up locally w/ Docker in <15 minutes. For our Cloud: https://www.onyx.app/. If there’s anything you'd like to see to make it a no-brainer to replace your ChatGPT Enterprise/Claude Enterprise subscription, we’d love to hear it!

133 points | by Weves 6 hours ago

31 comments

  • crocowhile 3 hours ago
    In a landscape where every week we have a different leading model, these systems are really useful for power users because they keep the interface and models constant and let you switch easily between APIs via OpenRouter or Naga. I have been using OpenWebUI, which is under active development, but I'll give this a try.
    • Weves 3 hours ago
      Yes, exactly! Would love to hear your feedback compared to OpenWebUI
  • visarga 3 hours ago
    That sidebar of past chats is where they go to be lost forever. Nobody has come up with a UI that has a decent search experience. It's like Reddit's internal search engine, but a bit worse.
    • aargh_aargh 2 hours ago
      Exactly. The best they came up with is a generated subject-like summary. So many options to explore here. Categorization by topic, by date, by customer account, clustering by topic, search with various ranking options, conversation(s) tree view, histogram per date/topic/account, integration with email, with an issue tracker, various states per chat/thread e.g. resolved/ongoing/non-viable, a knowledge bank to quickly save stuff you learned (code snippets, commands, facts), integration with Notion or a wiki etc etc. Just off the top of my head.

      I was told there would be rapid prototyping with AI. Haven't seen any of the above.

  • tomasphan 5 hours ago
    This is great, the value is there. I work for an F100 company that is trying (and failing) to build this in house because every product manager fundamentally misunderstands that users just want a chat window for AI, not to make their own complicated agents. Your biggest competition in the enterprise space, Copilot, has terrible UI and we only put up with it because it has access to email, SharePoint and Teams.
    • Weves 5 hours ago
      Haha, yea we've seen that exact story many times! Dissatisfied with Copilot and building a (not great) internal solution that is missing polish + most of the "advanced" feature set.
    • katzskat 5 hours ago
      I immediately thought of Google's Agentspace when I saw this product. The value for me sits in its ability to do RAG via connectors.
      • Weves 5 hours ago
        RAG + connectors is a huge reason why people deploy Onyx (enterprise search roots means we do a pretty good job there).

        Also, open-source really works here, since connectors are a long-tail game. We've tried to make it easy to add connectors (a single Python interface), and as a result over half of our connectors are contributed by the community. We expect that percentage to grow over time. This means that compared to something like Agentspace, we'll very likely be connected to all of the key tools at your company (and if we aren't, you can easily add an integration).

  • rao-v 5 hours ago
    I was pretty excited for Onyx as a way to stand up a useful open source RAG + LLM at small scale but as of two weeks ago it was clearly full of features ticked off a list that nobody has actually tried to use. For example, you can scrape sites and upload docs but you can’t really keep track of what’s been processed within the UI or map back to the documents cleanly.

    It’s nice to see an attempt at an end-to-end stack (for all that it seems this is “obvious” … there are not that many functional options), but wow, we’ve forgotten the basics of making useful products. I’m hoping it gets enough time to bake.

    • Weves 5 hours ago
      Really appreciate the feedback (and glad to hear the core concept resonated with you).

      The admin side of the house has been missing a bit of love, and we have a large overhaul coming soon that I'm hoping addresses some (most?) of your concerns. For now, if you'd like to view documents that have been processed, you can check out the `Explorer` panel on the left.

      In general, I'd love to hear more about what gives it that "unbaked" feel for you if you're up for a quick chat.

  • haolez 1 hour ago
    No, thank you. It is a VC-backed open source tool. At some point, you will "enshitificate" your product and/or squeeze me until it becomes unaffordable.

    I could just have kept this "negative" thought to myself, but maybe other lurkers think the same. Something for you guys to have in mind. Good luck!

    • Weves 20 minutes ago
      Totally get this concern.

      How we think about it: the chat product should be completely open-source and free (forever). To that end we've moved features like SSO (that used to be "enterprise") to be MIT licensed. The chat interface is something pretty much every team needs (be it a proprietary or open-source solution). You can think of this like Apache Spark for Databricks or Ray for Anyscale.

      Also, as other folks have pointed out in the thread, there are quite a few other open source options out there. So there's a ton of outside pressure for our open-source-only offering to be very competitive. We hope this reduces the "enshitification" risk that you speak of.

  • CuriouslyC 2 hours ago
    Something like this has a very limited shelf life as a product. What users need from chat is very user specific, trying to be the one chat to rule them all is not gonna end well, and as models get more capable each chat experience is going to need to be more customized.

    Something like this could have a nice future as an open source chat framework for building custom UIs if it's well made and modular, but that isn't gonna work well with a SaaS model.

    • bks 2 hours ago
      I've been using Onyx (and Danswer before it) for over a year, and I'd push back on this. We have Freshdesk, Bookstack, Google Drive, YouTrack, and Slack all connected. It seamlessly answers questions like:

      "What's Max's GitHub username?" "I need wire transfer instructions for an incoming wire"

      We also index competitors' helpdesks and KB articles to track new features they're rolling out. Our tech support team uses it daily because Freshdesk's AI is terrible and their internal KB search is lackluster. Onyx actually finds things. The value isn't in being "one chat to rule them all" — it's in unified search across disparate systems with citations. That's not getting commoditized anytime soon. Keep up the good work, team.

    • jstummbillig 2 hours ago
      I disagree. This has both APIs as well as connectors. One of the reasons I use Google Workspace as SaaS is its extensive API, which gives me the flexibility I need with a great starting point (and continued development that I continue to benefit from).
      • CuriouslyC 2 hours ago
        Yes, but imagine a chat app that's designed for accountants, that has widgets for accounting, and it's set up for accounting workflows. That's _HUGE_ but not something that a "one chat to rule them all" is going to just go and do. You could use that same example for lab technicians and any other role.
    • Weves 2 hours ago
      Hmm, will have to disagree here. I think "one chat to rule them all" is the way it will end.

      It does require having UI components for many different types of interactions (e.g. many ways to collect user input mid-session + display different tool responses like graphs and interactives). With this, people should be able to easily build complex tools/flows on top of that UI, and get a nice, single interface (no siloed tools/swapping) for free. And having this UI be open-source makes this easier.

      • CuriouslyC 2 hours ago
        I agree with an end state something like you describe, but I don't think it will be a chat app. I think you'll have an agent that lives outside your apps and manages your apps.
  • NickHoff 44 minutes ago
    Looks neat. FYI clicking "See All Connectors" is a 404.
    • Weves 25 minutes ago
      Thanks! And 404 should be fixed
  • asdev 4 hours ago
    Aren't most of the large frontier model providers SOC 2 compliant? I think AWS Bedrock is also SOC 2 compliant. Not sure why you would need to self-host anything then, as you'll get turnkey secure solutions from the bigger players.
    • Weves 3 hours ago
      Outside of pure security, we've seen many choose Onyx for flexibility + connectedness.

      One of our largest users has forked the repo and has contributed 20+ commits back to the repo with small customizations that are important to them (and that they could never get with ChatGPT Enterprise).

      Lots of companies we talk to value having the best model for the job (e.g. not being tied to only OpenAI models).

      Compared to model provider offerings, we also (thanks to open-source contributions) cover many more existing apps when it comes to connectors.

  • jryio 3 hours ago
    Do you know what's completely missing from all of these products, like AnythingLLM and Onyx...

    a mobile application that has feature parity with ChatGPT and Claude...

    • Weves 3 hours ago
      Hmm, yea that's a great callout. Something we definitely have in our sights longer term (focus for now is to make sure that the desktop chat experience is truly amazing).
    • bilekas 3 hours ago
      I hope you mean "parity" no?
  • panki27 4 hours ago
    What does this do that OpenWebUI (or one of the many other solutions) does not?
    • limagnolia 1 hour ago
      OpenWebUI isn't Open Source anymore. Open WebUI has an egregious CLA if I want to contribute back to it (Which I wouldn't do anyway because it isn't Open Source...)

      Onyx Devs: This looks awesome, I will definitely add it to my list of things to try out... close to the top! Thanks, and please keep it cool!

    • hobofan 4 hours ago
      As someone building another competitor in the field, I'll relay some reasons why some of our customers ruled out OpenWebUI in their decision-making process:

      - Instability when self-hosting

      - Hard to get in touch with sales when looking for SLA-based contracts

      - Cluttered product; Multiple concepts seemingly serving the same purpose (e.g. function calling vs. MCP); Most pre-MCP tools suffer from this

      - Trouble integrating it with OIDC

      - Bad docs that are mostly LLM generated

    • Weves 4 hours ago
      Broadly, I think other open source solutions are lacking in (1) integration of external knowledge into the chat (2) simple UX (3) complex "agent" flows.

      Both internal RAG and web search are hard to do well, and since we started as an enterprise search project, we've spent a lot of time making them good.

      Most (all?) of these projects have UXs that are quite complicated (e.g. exposing front-and-center every model param like Top P without any explanation, no clear distinction between admin/regular user features, etc.). For broader deployments this can overwhelm people who are new to AI tools.

      Finally, trying to do anything beyond a simple back-and-forth with a single tool call isn't great with a lot of these projects. So something like "find me all the open source chat options, understand their strengths/weaknesses, and compile that into a spreadsheet" will work well with Onyx, but not so well with other options (again partially due to our enterprise search roots).

  • dannylmathews 4 hours ago
    The license on this project is pretty confusing. The license at the root of the project links to backend/ee/LICENSE.md which says you need a subscription license to use the code.

    Can you call it open source if you need a subscription license to run / edit the code?

    • Weves 4 hours ago
      You don't need any subscription to run the code! By default, none of the enterprise code runs (and it can all be completely removed and the app will work as expected). Fully FOSS version here: https://github.com/onyx-dot-app/onyx-foss.
      • dannylmathews 3 hours ago
        that's fair. What does the enterprise code do vs the FOSS?
        • bilekas 3 hours ago
          As I see it, it has whitelisting and enterprise integrations... as for the OS version, maybe you need to roll your own. This is a usual monetization method though.
        • Weves 3 hours ago
          All of the core chat UX + "add-ons" is in FOSS!

          In the enterprise:

          - permission syncing

          - UI-based white labeling

          - Advanced RBAC

          - Usage analytics UI

    • martypitt 4 hours ago
      It's not really confusing at all.

      Content under backend/ee requires a license, everything else is MIT Expat. Pretty standard stuff.

      > Can you call it open source if you need a subscription license to run / edit the code?

      MIT is open source, their other stuff isn't. Pretty clear.

    • maxloh 2 hours ago
      That's exactly the same approach employed by Gitlab, which is actively deployed and used by GNOME and F-Droid.

      Could you elaborate why this approach is confusing?

    • csomar 4 hours ago
      Yes. Open source doesn’t mean free.
      • fhd2 4 hours ago
        It really does, by any definition I've ever heard. I suppose the authoritative one would be [1].

        A common "trick" for commercial open source software is to use a copyleft license, which restricts redistribution as part of commercial products, and to offer a paid license to get around that.

        [1]: https://opensource.org/osd

        • embedding-shape 3 hours ago
          Nothing in that "authoritative" definition says you cannot charge for binaries, for example. It's talking mainly about source code itself. Something you just publish the source for but charge for anything else, would be fair game and still "open source" by that definition.
          • fhd2 2 hours ago
            Agreed, "free" is too broad.

            I was responding to parent's question though: "Can you call it open source if you need a subscription license to run / edit the code?"

            I'd say no. If you have the code in front of you, it shouldn't require a license to run. Even if the whole point of the open source software is to interact with a proprietary piece of software or service, you could still run it for free, it probably just wouldn't have much utility.

        • fragmede 4 hours ago
          GNU disagrees.

          > Many people believe that the spirit of the GNU Project is that you should not charge money for distributing copies of software, or that you should charge as little as possible—just enough to cover the cost. This is a misunderstanding.

          > Actually, we encourage people who redistribute free software to charge as much as they wish or can. If a license does not permit users to make copies and sell them, it is a nonfree license.

          https://www.gnu.org/philosophy/selling.html

          • fhd2 2 hours ago
            Fascinating, from skimming that, it does indeed appear that it would be within the GNU philosophy to distribute source code solely in exchange for payment. Doesn't cover a case where the source code is _already_ distributed though, then it's free to run.

            And even if the source code was only distributed to paying customers, that'd likely be a temporary situation. A relevant quote:

            "With free software, users don't have to pay the distribution fee in order to use the software. They can copy the program from a friend who has a copy, or with the help of a friend who has network access."

            I do read the GPLv3 such that if someone _does_ buy the code in any fashion, you must provide the source code to them for free. Relevant excerpt from section 6:

            "[...] give anyone who possesses the object code either (1) a copy of the Corresponding Source for all the software in the product that is covered by this License, on a durable physical medium customarily used for software interchange, for a price no more than your reasonable cost of physically performing this conveying of source, or (2) access to copy the Corresponding Source from a network server at no charge."

            But yeah, no obligation to provide the source code for free to non-customers, fair point. Just no ability to stop customers from sharing it with non-customers. Does make sense.

      • derleyici 4 hours ago
        No, they must then state that it is source-available, not open source.
  • hobofan 4 hours ago
    Congrats on the launch!

    We are building a competing open source tool[0] with a very similar focus (strongly relying on interoperable standards like MCP; built for enterprise needs, etc.), though bootstrapping with customers rather than being VC funded. It's nice to see a competitor in the field following similar "OSS Friends" principles, while many of the other ones seem to have strong proprietary tendencies.

    (Small heads up: The "view all integrations" button goes to a 404)

    [0] https://erato.chat/

  • thedangler 2 hours ago
    If I'm understanding this right, I can train it on private data only for my company and then I can use the chat bot only for my site to acquire customers?
  • bilekas 3 hours ago
    Interesting product and best of luck with it.

    > but I’m going to start by connecting GPT-4o, Claude Sonnet 4, and Qwen to provide my team with a secure way to use them

    I did get a little giggle out of that because I've never heard anyone say that hooking up 3rd party llms to anything was any way secure.

    • Weves 3 hours ago
      Thanks for the kind words!

      The key point there is that many would do it through Azure / Bedrock + locally host the open-source models. Also, all chats / indexed data lives on-prem, and there are better guarantees around retention when using the APIs directly.

      • bilekas 3 hours ago
        Ah I see.. That makes a bit more sense and definitely adds a value multiplier for enterprises I would imagine! I'll try out the open source one and see how it works out!
      • thinkloop 2 hours ago
        Is running your llm through azure insecure? I mean more so than running anything on cloud? My understanding was that azure gpt instances were completely independent with the same security protocols as databases, vms, etc.
        • bilekas 11 minutes ago
          Azure wouldn't be insecure if you have your company AD/OAuth. I'm GUESSING that running local models with data transfer might expose that communication if your local machine (or someone else's) is compromised; potentially there are multiple points of leakage, and companies generally like to limit that risk. This is all an assumption btw.

          Edit : grammar

  • jbuendia829 2 hours ago
    Congrats on the launch!

    Curious, it's a crowded space with other enterprise search companies like Glean and Elastic, and other companies coming in like Notion and Slack.

    Why should a prospect choose Onyx over the others?

    • Weves 2 hours ago
      Great question! Depends on the specific alternative, but the broad points are:

      - "pure chat" experience. From our community (and personal use), we've observed that most queries don't actually involve enterprise search. They much more likely to just require the LLMs internal knowledge (or web search / code execution). Compared to all the companies you've mentioned, we've spent a lot more time refining this more common flow.

      - Larger connector suite. As soon as one key source isn't connected, the trustworthiness of the system drops dramatically. You second-guess "is the info needed to answer this question in there?" for every question. We have a community who builds out connectors for themselves and then contributes them back for everyone to use. This allows us to cover the long tail better than companies like Notion and Slack.

      - Customizability. An open-source application is the perfect middle ground between a SaaS offering and building blocks. A SaaS option doesn't allow for any customization (we have many customers who have contributed back ux enhancements, small features like guardrails, or enhanced configurations that their users want). Building blocks demand too much domain expertise (search, frontend/UX, ...) for it to be realistic for companies to build something great.

  • terminalkeys 2 hours ago
    The UI looks so close to Open WebUI I was shocked this isn't a fork. It even looks like it takes OWUI's unique model customization features, but makes them agents.

    Might have to try this out. OWUI's lagging docs have made managing my own self-hosted instance a pain.

    PS: Your _See All Connectors_ button on the homepage is 404ing.

    • Weves 1 hour ago
      Haha, yea the UIs certainly have similarities (much of the industry converges to standard places to put different components, since users are familiar).

      "Agents" is a particular area where we feel like we're better than the alternatives (especially if you want something that effectively calls multiple tools in sequence). Curious to hear your thoughts after trying it out!

  • alalani1 3 hours ago
    Do you let organizations white-label it so it's more customized (i.e. remove the Onyx branding, preload it with their internal MCP servers / docs) and feels like their own internal chat tool?
    • Weves 3 hours ago
      Yes absolutely! There's no license restriction on white-labeling, so we've seen lots of companies do that.

      In our opinion, it's a bit silly to build completely in house when you can take something like Onyx as the starting point and be >95% of the way there + have a ton of bells and whistles built in.

  • pablo24602 5 hours ago
    Congrats on the launch! Every enterprise deserves to use a beautiful AI chat UI (and Onyx is a fantastic and easy to try option).
  • dberg 5 hours ago
    how is this different from Librechat?
    • Weves 4 hours ago
      Some of the key differences:

      1/ large connector suite + good RAG. Answering at scale is hard, and given our enterprise search roots, we've spent a lot of time on it. It's something that many teams expect from their chat UI.

      2/ deep research + open-source code interpreter.

      3/ simpler UX. LibreChat has a lot of customizability exposed front and center to the user, which is great for the power user but can be overwhelming for someone new to using AI systems.

    • hobofan 4 hours ago
      LibreChat has recently been acquired by ClickHouse, so who knows what their future holds.
  • diebillionaires 4 hours ago
    Seems like AnythingLLM
  • winddude 4 hours ago
    why use a name that's already used and known in the ML/AI community, even if spelled differently (onyx vs onnx)?
    • Weves 3 hours ago
      That's a fair point! We were aware of onnx, but felt it was okay since they are very different products, so there shouldn't be too much confusion (people generally know which onyx/onnx they are looking for).
      • embedding-shape 3 hours ago
        You're almost literally in the same ecosystem; it's not like one is a chat UI for LLMs and the other a supermarket, but an ecosystem of open source machine learning software, libraries and tools. That the pronunciation is identical makes it untenable; you really need to reconsider the name, as discussions in person will get confusing.
  • awaseem 5 hours ago
    This is awesome and love that its open source!
    • Weves 5 hours ago
      Thanks! Open source is awesome :)
  • mentalgear 5 hours ago
    A bit like mastra.ai - my go-to SOTA solution for this kind of LLM flow coordination (though more dev-focused). (yes I realise this is more user-facing)
  • simianparrot 3 hours ago
    ... just because OpenAI call their product "ChatGPT" doesn't mean the term "chat" should now mean "interacting with an LLM".
  • KaoruAoiShiho 5 hours ago
    I've been using Cherry Studio, works great.
  • nawtagain 5 hours ago
    Congrats on the launch!

    Can you clarify the license and if this actually meets the definition of Open Source as outlined by the OSI [1] or if this is actually just source available similar to OpenWebUI?

    Specifically can / does this run without the /onyx/backend/ee and web/src/app/ee directories which are licensed under a proprietary license?

    1 - https://opensource.org/licenses

    • Weves 5 hours ago
      Yes, it absolutely does! The chat UX, add-ons (deep research, code interpreter, RAG, etc.), and SSO are MIT licensed. Most deployments of Onyx are using the pure FOSS version of Onyx. Many individuals / teams have done extensive white labeling (something that OpenWebUI doesn't allow).

      We have https://github.com/onyx-dot-app/onyx-foss, a fully MIT-licensed version of the repo, if you want to be safe about the license / feel free to modify every file.

  • Der_Einzige 5 hours ago
    Sorry, but why use this over oobabooga/sillytavern?

    Why do we have to yet again poorly copy an oversimplified UI?

    The value of local models comes from their huge amount of settings/control that they offer. Why must we throw that all away?

    Yet again, the world waits for good UI/UX for pro/prosumers with AI systems. No one is learning from ComfyUI, Automatic1111, or SillyTavern. No, LM-Studio is not actually prosumer

    • Weves 3 hours ago
      We're definitely looking to add back some of that flexibility / customizability. I don't think you have to sacrifice a nice, simple UI to provide what power users are looking for.

      For now, the main reasons for a prosumer to use this over oobabooga/sillytavern are the base tool set we provide and the "agent loop". If you ever want to use your single chat interface to do data analysis (code interpreter), multi-step realtime research (deep research), or RAG over large-scale data (hybrid search), Onyx would be a particularly good choice.

    • hobofan 4 hours ago
      You are not the target user.

      The target customers are enterprises where 70% of the target users have never used an AI system before and have to go through an AI training before being allowed access to the system.

  • polynomial 4 hours ago
    Onyx?

    And no one bothered to say anything to them?

  • turblety 4 hours ago
    What is it with these Chat apps having strange and not-real open source licenses? OpenWebUI is the same. Is there something about these chat apps that seems to make them more prone to weird and strange licenses? Just opportunist?
    • hobofan 4 hours ago
      MIT core + "ee" (enterprise edition) commercially licensed extension subdirectory isn't that strange of a occurrence nowadays.

      I also wouldn't pin it as chat-app specific. Quite a few VC-funded open-core projects have adopted that pattern post ~2020(?): cal.com, Dagster, Gitlab

      • Weves 4 hours ago
        Yea, the license is modeled after the Gitlab license. All of the core chat/RAG/agent logic is fully MIT, and >99% of deployments of Onyx are using the "community edition"!
    • cjonas 4 hours ago
      Copilotkit is in the same boat. There are parts of the open source codebase that require an enterprise license to use. Basic things like "on error" handlers that are completely offline features. (They might have moved away from this, I haven't checked in a while)
    • observationist 3 hours ago
      If you tack on these faux-pen source VC licenses and complicate things, you're signaling dishonesty and dark patterns. It might not be the case, but it's not a good look imo. VCs don't seem to care, though - it's all about securing the future payoff, doesn't matter what principles or norms get trampled in the process, and it's only a small set of FOSS nerds that ever get bothered by it, anyway.

      Thanks, lawyers, you make everything better!

    • scotty79 4 hours ago
      New tech draws new people. New people have new ideas. Also for licenses.
  • _pdp_ 5 hours ago
    > We’re building an open-source chat that works

    As long as you have Pricing on your website your product is not open source in the true spirit of open sourceness. It is open code for sure, but it is a business, and the incentive is to run it like a business, which will conflict with how the project is used by the community.

    Btw, there is nothing wrong with that, but let's be honest here: if you get this funded (perhaps it already is), who are you going to align your mission with - the open source community or shareholders? I don't think you can do both. Especially if a strong competitor comes along that simply deploys the same version of the product. We have seen this story many times before.

    Now, this is completely different from, let's say, Onyx being an enterprise search product where you create a community-driven version. You might say that fundamentally it is the same code, but the way it is presented is different. Nobody will think this is open-source, but more of "the source is available" if you want to check.

    I thought perhaps it would be beneficial to share this perspective here, if it helps at all.

    Btw, I hear good things about Onyx and I have heard that some enterprises are already using it - the open-source version.

    • martypitt 4 hours ago
      > As long as you have Pricing on your website your product is not open source in the true spirit of open sourceness.

      It's an MIT license. That IS open source.

      If they have a commercial strategy - that's a GoodThing. It means they have a viable strategy for staying in business, and keeping the project maintained.

      MIT == OpenSource. Pricing == Sustainable. That's a horse worth backing IMO.

      • WhitneyLand 3 hours ago
        Exactly, if everything looked too good to be true and there was no transparency or hint of a business model it’s actually less attractive for some who value predictability.
      • _pdp_ 2 hours ago
        You are not wrong but in most cases this is a trojan horse. It has the characteristics of a classic rugpullware.

        At the top level it looks like open source, but it is not really, because parts (the most useful ones) of the project are not. Imagine if Python was open source but the core libraries were not. You wouldn't call this open source in the true spirit of open source. You could make the argument that at least it is sustainable because they now have a business model. It doesn't add up.

        I prefer a more honest take on software. There is nothing wrong with making money while contributing back to the community in some meaningful way or by simply being transparent. In fact this is the best kind, and there are plenty of good examples.

        All I am saying is that when I see such projects I tend to think that in most cases they are dishonest to themselves or their communities or both.

      • zb3 2 hours ago
        > It's an MIT license. That IS open source.

        The source is available and you can do much with it, but the incentive is that this alone should not be enough.

    • Weves 3 hours ago
      Great to hear that you've heard good things. And yea, we have many large (1k+ person) teams using just the open-source version (something we love to see).
    • wg0 5 hours ago
      Actually - if you have a bunch of VCs on your back, you can't even align with your very own user base, let alone any other wider community.
  • phildougherty 5 hours ago
    Honestly surprised something like this can get funded
    • Weves 5 hours ago
      "Chat UI" can "feel" a bit thin from an eng/product when you initially think about, and that's something we've had to grapple with over time. As we've dug deeper, my worry about that has gone down over time.

      For most people, the chat is the entrypoint to LLMs, and people are growing to expect more and more. So now it might be basic chat, web search, internal RAG, deep research, etc. Very soon, it will be more complex flows kicked off via this interface (e.g. cleaning up a Linear project). The same "chat UI" that is used for basic chat must (imo) support these flows to stay competitive.

      On the engineering side, things like Deep Research are quite complex/open-ended, and there can be huge differences in quality between implementations (e.g. ChatGPT's vs Claude's). Code interpreter (done securely) is also quite a tricky task.

    • gip 4 hours ago
      My understanding of YC is that they place more emphasis on the founders than the initial idea, and teams often pivot.

      That being said, I think there is an opportunity for them to discover and serve an important enterprise use case as AI in enterprise hits exponential growth.

    • mritchie712 5 hours ago
      w24, those were different times.
      • koakuma-chan 4 hours ago
        Yeah that's like so long ago. But yeah, good luck competing with ChatGPT.
        • hobofan 4 hours ago
          There are many markets (Europe), and highly regulated industries with air-gapped deployments where the typical players (ChatGPT, MS Copilot) in the field are having a hard time.

          On another axis, if you are able to offer BYOK deployments and the customers have huge staff with low usage, it's pretty easy to compete with the big players due to their high per-seat pricing.

          • Weves 4 hours ago
            There are also many teams we work with that want to (1) retain model flexibility and (2) give everyone at the company the best model for the job. Every week or so, a model from a different provider comes out that is better at some tasks than anything else. It's not great to be locked out from using that model just because you're a "ChatGPT" company.
    • kurtis_reed 5 hours ago
      why?
      • phildougherty 43 minutes ago
        I wasn't trying to be a hater, I think it is great they got funded for this. It just felt like there are so many free options and alternatives out there addressing basically the same things (and looking almost exactly the same) that it genuinely surprised me.
      • xenospn 5 hours ago
        there are a million other projects just like this one, many that are much more advanced and mature, including from Vercel. There's no moat.
        • Weves 4 hours ago
          Agreed that there are a lot of other projects out there, but why do you say the Vercel option is more advanced/mature?

          The common trend we've seen is that most of these other projects are okay for a true "just send messages to an AI and get responses" use case, but for most things beyond that they fall short / there are a lot of paper cuts.

          For an individual, this might show up when they try more complex tasks that require multiple tool calls in sequence or when they have a research task to accomplish. For an org, this might show up when trying to manage access to assistants / tools / connected sources.

          Our goal is to make sure Onyx is the most advanced and mature option out there. I think we've accomplished that, so if there's anything missing I'd love to hear about it.

          • elpakal 2 hours ago
            Alright, let's say I'm tasked with building a fancy AI-powered research assistant and I need either onyx or Vercel's ai-chatbot SDK. Why would I reach for onyx?

            I have used vercel for several projects and I'm not tied to it, but would like to understand how onyx is comparable.

            Benefits of using vercel for my use cases have been ease of installation, streaming support, model agnosticism, chat persistence and blob support. I definitely don't like the vendor lock-in, though.

    • skeezyjefferson 5 hours ago
      [dead]