They didn't pivot, they completely reinvented themselves. Twice.
I loved their first cloud offering, which they sadly abandoned.
Then they launched Space, which was kinda cool, but mostly weird and raised the question "why?". Also cancelled.
Surf looks mostly cool, although I also don't quite understand it. It seems like Notion with a different twist on AI. Not sure. Since I'm fairly happy with my Obsidian + Codex setup, I'll pass for now. The good news is, this one's open source!
I'd love to know how they're financing all of this. They have been around for years and users never even had the option to drop money in their lap. Now they're trying open source. Wild ride.
hey Pietz,
have you written up your obsidian + codex setup? I'm a die-hard obsidian user, didn't like codex 6mo ago but heard it's gotten much better, so I'm very interested if you're willing to share
TIA!
I have an AGENTS.md file in my vault that provides a bit of context: this is not about coding, but about serving as my Obsidian helper. I ask it to check the .obsidian folder to see which plugins I use, tell it how I like to build my presentations, and describe the workflow I follow when writing articles.
Then I just launch codex in the terminal inside the vault and have it do stuff while I watch the desktop app for changes.
Other coding agents like Claude Code should work too, but gpt-5-codex was specifically trained to do everything through the terminal and doesn't even have that big of a coding-related system prompt. Works well.
Combine that with a speech-to-text app and codex blazes through finding stuff or doing things for me.
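For reference, a minimal AGENTS.md along these lines might look like the following. The specifics (plugin file path, folder names, workflow rules) are illustrative placeholders, not the actual file described above:

```markdown
# AGENTS.md

This vault is an Obsidian knowledge base, not a software project.
Act as my Obsidian assistant, not a coding agent.

## Context
- Check `.obsidian/community-plugins.json` to see which plugins are enabled.
- Notes are plain Markdown; preserve existing frontmatter and wikilinks.

## Workflows
- Presentations: one H2 per slide, short bullet points, no long paragraphs.
- Articles: draft in `drafts/`, move to `published/` only when I say so.
```

Obsidian does keep an enabled-plugin list under the `.obsidian` folder, which is why pointing the agent there works.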
Interesting! I'm very intrigued by the possibility of new forms of computer interaction made by breaking down app silos and linking data across various mediums. Something that caught my eye recently is Atuin Desktop, which seems to be a jupyter notebook style thing that runs your code natively.
The benefit of this is that I can connect this data, but in a computable medium. Does your product have any similar ability to bring code workflows "inside" the Surf application?
Basically you can supply input context and a runnable file will generate & run in your document. Useful for creating interactive charts or small applets.
Our philosophy is that if you want to update it, you should be able to use your local code editor, not be stuck in Surf!
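As a rough illustration of what such a runnable file could be, here's a hypothetical applet sketched as a single self-contained HTML file (the repo's docs describe these as plain HTML, which is what keeps them editable in any local editor); the content and IDs are made up:

```html
<!-- Hypothetical Surf applet: one self-contained HTML file, no dependencies -->
<!DOCTYPE html>
<html>
<head><meta charset="utf-8"><title>Word count</title></head>
<body>
  <textarea id="input" rows="6" cols="40"></textarea>
  <p>Words: <span id="count">0</span></p>
  <script>
    // Recount on every keystroke. Because the whole applet is one file
    // with no build step, it can be opened and tweaked locally.
    const input = document.getElementById("input");
    const count = document.getElementById("count");
    input.addEventListener("input", () => {
      count.textContent = input.value.trim().split(/\s+/).filter(Boolean).length;
    });
  </script>
</body>
</html>
```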
Saw it on Twitter and was interested. But from the video and demos I immediately did not understand why Notebooks and Notes are two tabs? In my mind, a Note is IN a Notebook, not some separate adjacent item...
This is a neat app. Though when looking at the files in Finder, it looks like some files are stored as '.md' files while the sample md files only contain HTML.
Ah, that might be a bug, sorry -- the sample notes should also have the html extension. The notes themselves are currently stored as HTML (mainly because of the metadata and state in notes); you can still export to markdown.
We need to do some work before we can also just store them simply as markdown files.
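In the meantime, a crude way to pull plain text out of an HTML-stored note yourself is a small stdlib-only Python sketch like the one below. This assumes simple note markup; real notes may carry state and metadata this ignores, and a proper export should use the app's own markdown export:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect text content, inserting a newline after common block elements."""
    BLOCK_TAGS = {"p", "div", "li", "h1", "h2", "h3"}

    def __init__(self):
        super().__init__()
        self.parts = []

    def handle_data(self, data):
        # Raw text between tags.
        self.parts.append(data)

    def handle_endtag(self, tag):
        # Break lines where block-level elements end.
        if tag in self.BLOCK_TAGS:
            self.parts.append("\n")

    def text(self):
        return "".join(self.parts).strip()

extractor = TextExtractor()
extractor.feed("<h1>Note</h1><p>Stored as HTML, <b>exported</b> as text.</p>")
print(extractor.text())
```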
For all the screen recordings we use Screen Studio by Adam Pietrasiak. Really an all-in-one workhorse for everything screen-recording related.
The main teaser video was made by our incredible video editor Célestin (https://www.celest.in/), who works with After Effects, Premiere and Blender.
Yes great point. Photos are intended as a very important subset of media.
Not (yet) super usable, but for photos we have local OCR running, with Surf creating additional metadata (e.g. the link from where the photo was downloaded).
Can imagine some use cases -- off the top of my head, suggesting the right photos to embed in a note in response to a user query.
Surf is built entirely on editable WYSIWYG documents, NotebookLM's main AI is built on chat. Surf is built to be a bit more open, NotebookLM was a bit locked down for our taste.
An example I'd highlight is taking notes against a PDF.
NotebookLM will convert the PDF to simple text, and the chat responses are read only. NotebookLM also has a lot of strict walls between chat, artifacts & sources. You have to "save" responses as (read only) notes, and move notes to sources.
With Surf you can generate notes that deep link to specific pages in the PDF, and Surf will open those pages in the original PDF. You can remove the fluff you don't want in your notes. The intention is to be a little more open -- all notes are sources from the get go, you don't have to save or migrate anything.
Great question. We'd look at Obsidian as a reference.
We think there are a lot of value added services that can happen when you need servers, but they should be user aligned & optional (Mobile app, sync, publishing, backups, remote jobs, collab).
We want Surf's client to be independent & open, but offer some of these on top -- for people who want them!
Looks to me like an effort to pivot from an AI-embedded browser into a local system browser after the ChatGPT Atlas release.
I still don't see the advantage I get for my local system? Nearly all of the actions on the demo page are doable with ChatGPT in one to three interactions.
The big difference UX-wise between chatbots and Surf is that Surf is built entirely on editable documents that you can mold / craft into an output (vs chat).
We actually had a chatbot, but our explorations showed that notes were more effective in many cases!
An example of local data is that "Applets" made in Surf can be opened / updated from your local code editor, they're just HTML files.
Umm -- not being tied to ChatGPT? Like, that's huge. I personally do not consider consistently using any AI tool unless it has a local option. I've been air-quotes "paranoid" about things like this my whole life and it's served me QUITE well.
All the best!
PS: I would have paid for deta cloud Pro ;)
You forgot about the Deta Studio and the Horizon desktop app ;)
But on your question -- we're backed by supportive investors, and try to be frugal (we've had our fair share of sardines & rice).
Atuin: https://github.com/atuinsh/desktop
We have a super early form of runnable code, called "Surflets". More info here: https://github.com/deta/surf/blob/main/docs/SURFLETS.md
(we have work to do to make this solid)
Regarding open models: what is the go-to way for me to make Surf run with qwen3-vl? Ollama?
As far as I understand any endpoint that supports the completions API will work?
https://github.com/deta/surf/blob/main/docs/AI_MODELS.md
If I attach image context will it be provided to qwen3-vl? Or does this only work with the "main" models like OpenAI, Anthropic, Gemini and so on?
Yes, we support any endpoint that supports the completions API. And yes, Ollama might be the easiest to set up. Images should also work with qwen3-vl.
But if you run into any issues, please feel free to submit a bug report https://github.com/deta/surf/issues
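For the curious, here is roughly what an OpenAI-compatible chat-completions request with an inline image looks like. The endpoint URL, port, and model name are assumptions for a default local Ollama setup, not Surf's actual wiring:

```python
import base64
import json

def build_payload(prompt: str, image_bytes: bytes, model: str = "qwen3-vl"):
    """Build an OpenAI-style chat-completions payload with an inline image."""
    b64 = base64.b64encode(image_bytes).decode("ascii")
    return {
        "model": model,
        "messages": [{
            "role": "user",
            "content": [
                {"type": "text", "text": prompt},
                # Images travel as base64 data URLs in the message content.
                {"type": "image_url",
                 "image_url": {"url": f"data:image/png;base64,{b64}"}},
            ],
        }],
    }

payload = build_payload("What's in this image?", b"\x89PNG...")
# POST this to e.g. http://localhost:11434/v1/chat/completions
# (Ollama's OpenAI-compatible endpoint) with Content-Type: application/json.
body = json.dumps(payload).encode("utf-8")
```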
I would call this a note-taking app rather than a notebook, which to many means a computational notebook like Jupyter.
We took inspiration from analog notebooks as a tool for thought, but wanted something for multi-media. We also see NotebookLM as the closest mainstream product to Surf.
Related -- some people who have seen Surf's Applets feature have also called Surf "Jupyter for normies".
More on Surflets: https://github.com/deta/surf/blob/main/docs/SURFLETS.md
We recently introduced a sidebar (after the video was made) which has them organized as you mention.
Don't know too many, unfortunately. AI Tinkerers organizes a few a year: https://berlin.aitinkerers.org/
One big distinction: we aren't trying to automate people's browsing / clicking, but augment people's thinking.
This creates a different feature set / model: tabs feeding into a document vs a chat going off and automating activity in tabs.