Zed Weekly: #25

November 3rd, 2023

It's been a bit of a wild ride the last few weeks, as we've been deeply engaged in a rewrite of Zed's UI framework and an upgrade of the entire application to the new foundation.


Why now? After all, shouldn't we be adding support for X, Y, and Z? They call it software development for a reason. A codebase that only grows can't truly develop, and sometimes revolution is necessary. In this case, it's a revolution we'd like to complete before we open up Zed's source.

What's emerging with GPUI 2 is really exciting. The first version of GPUI got Zed to the product you use today, which we're quite proud of; its strengths are a big part of what makes Zed what it is. But the weaknesses of GPUI 1 were holding Zed back from where it's going.

GPUI 2 carries forward the performance and reliability advantages of the first version, while radically improving the ergonomic issues that have slowed us down. Any contributor to Zed will need to learn this framework, so the return on investment in ergonomics is high.

I'll share some of the key ideas.

In GPUI, all application state is owned by a single object called the AppContext. Models are one kind of state. Given an AppContext, you create a model as follows:

let cx: &mut AppContext = todo!("keep reading");
cx.new_model(|cx| Document {
    path: "/journal/2023/11/3.md".into(),
    text: "I'm tired, but also inspired!".into(),
});

It can be helpful to package this up in a constructor function, as follows:

impl Document {
    pub fn new(path: Option<SharedPath>, cx: &mut AppContext) -> Model<Document> {
        cx.new_model(|cx| {
            let text = Rope::new(cx);
            text.subscribe(cx, |this, text, edit: &Patch<u32>, cx| {
                for mention in this.scan_mentions(text, edit, cx) {
                    // Present a toast to the user for each new mention.
                }
            });
            Self { path, text }
        })
    }
}

In the above code, when we create a document, we create a rope to store its text, which is also a model with a similar constructor taking a context. We then subscribe to edit events on the text, which are expressed as "patches". We call the scan_mentions method, then present a toast to the user if the edits contain a new mention.

A Model<Document> is similar in some ways to reference types that come standard with Rust like Box or Rc. The difference is that models are stored within AppContext, which if you squint, is a bit like an application-specific heap. To dereference a Model<T>, you need to pass a context that owns its state.
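The "application-specific heap" analogy can be made concrete with a toy sketch. None of the following is GPUI's actual implementation; the `Context` type and index-based `Model<T>` below are purely illustrative, but they show why dereferencing a handle requires the context that owns the storage:

```rust
use std::any::Any;
use std::marker::PhantomData;

// A toy stand-in for AppContext: it owns all model state.
struct Context {
    entities: Vec<Box<dyn Any>>,
}

// A toy Model<T>: a typed index into the context, rather than a pointer.
struct Model<T> {
    ix: usize,
    _marker: PhantomData<T>,
}

impl Context {
    fn new() -> Self {
        Context { entities: Vec::new() }
    }

    // Build a value inside the context and hand back a typed handle to it.
    fn new_model<T: 'static>(&mut self, build: impl FnOnce(&mut Context) -> T) -> Model<T> {
        let value = build(self);
        self.entities.push(Box::new(value));
        Model {
            ix: self.entities.len() - 1,
            _marker: PhantomData,
        }
    }

    // Dereferencing a Model<T> requires the context that owns its state.
    fn read<T: 'static>(&self, model: &Model<T>) -> &T {
        self.entities[model.ix].downcast_ref::<T>().unwrap()
    }

    fn update<T: 'static>(&mut self, model: &Model<T>, f: impl FnOnce(&mut T)) {
        f(self.entities[model.ix].downcast_mut::<T>().unwrap());
    }
}

fn main() {
    let mut cx = Context::new();
    let doc = cx.new_model(|_cx| String::from("I'm tired, but also inspired!"));
    cx.update(&doc, |text| text.push_str(" Onward."));
    println!("{}", cx.read(&doc)); // prints "I'm tired, but also inspired! Onward."
}
```

Because the handle is just an index, cloning or storing a `Model<T>` never borrows the underlying data; all access is funneled through the context, which is the essence of the real design.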

When models are constructed or updated, they can interact with this context to emit and subscribe to events, notify other models that they have changed, and access other application-wide APIs.
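The emit-and-subscribe pattern can also be sketched in isolation. This toy `Emitter` (and the `Edited` event) are made-up names for illustration, not GPUI's API:

```rust
use std::cell::Cell;
use std::rc::Rc;

// A toy event emitter: subscribers are closures invoked on each emitted event.
struct Emitter<E> {
    subscribers: Vec<Box<dyn FnMut(&E)>>,
}

impl<E> Emitter<E> {
    fn new() -> Self {
        Emitter { subscribers: Vec::new() }
    }

    fn subscribe(&mut self, f: impl FnMut(&E) + 'static) {
        self.subscribers.push(Box::new(f));
    }

    fn emit(&mut self, event: E) {
        for subscriber in &mut self.subscribers {
            subscriber(&event);
        }
    }
}

fn main() {
    // A hypothetical "edit" event, loosely analogous to the patches above.
    struct Edited {
        new_len: usize,
    }

    let seen = Rc::new(Cell::new(0));
    let seen_in_handler = seen.clone();

    let mut text_events = Emitter::new();
    text_events.subscribe(move |event: &Edited| {
        seen_in_handler.set(event.new_len);
    });

    text_events.emit(Edited { new_len: 42 });
    assert_eq!(seen.get(), 42);
}
```

In GPUI the context mediates this wiring, so subscriptions can reference other models by handle instead of closing over shared state directly.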

In addition to Model<T>, there is also View<T>. Views wrap models, but they enforce that T implements the Render trait, which has a single method mapping the state in T to a tree of elements.

pub struct View<V: Render> {
    pub(crate) model: Model<V>,
}

pub trait Render: 'static + Sized {
    type Element: Element<Self> + 'static;

    fn render(&mut self, cx: &mut ViewContext<Self>) -> Self::Element;
}

Actually, render can return anything that is impl Component. A Component is any type that can be converted into a tree of elements. It also has a render method, but it moves self instead of taking it by reference.

/// The core stateless component trait, simply rendering an element tree
pub trait Component {
    fn render<V: 'static>(self, cx: &mut ViewContext<V>) -> AnyElement<V>;
}

When impl Trait in traits hits stable (Rust 1.75.0), the Render trait will look like this:

pub trait Render: 'static + Sized {
    fn render(&mut self, cx: &mut ViewContext<Self>) -> impl Component;
}
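To get a self-contained taste of what return-position impl Trait in traits enables, here is a small standalone example (the `Describe` trait is made up for illustration and is not part of GPUI):

```rust
use std::fmt::Display;

// With impl Trait in traits (stable as of Rust 1.75), each implementer can
// return its own concrete type without declaring an associated type.
trait Describe {
    fn describe(&self) -> impl Display;
}

struct Point {
    x: i32,
    y: i32,
}

impl Describe for Point {
    fn describe(&self) -> impl Display {
        // This implementer happens to return a String; another could
        // return any other Display type without changing the trait.
        format!("({}, {})", self.x, self.y)
    }
}

fn main() {
    let p = Point { x: 3, y: 4 };
    println!("{}", p.describe()); // prints "(3, 4)"
}
```

This is exactly the simplification the Render trait gets: the `type Element` associated type disappears, and each view's render method just returns impl Component.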

๐Ÿ˜ฝ๐Ÿ Thank you Rust language developers.

Finally, here's a full-fledged, if minimal, example of a GPUI 2 application.

struct Hello {
    user: SharedString,
}

impl Render for Hello {
    type Element = Div<Self>;

    fn render(&mut self, cx: &mut ViewContext<Self>) -> Self::Element {
        let color = cx.theme().colors();
        div()
            // Flex properties
            .flex()
            // Size properties
            .w_96() // Set width to 384px
            .p_4() // Add 16px of padding on all sides
            // Color properties
            .bg(color.surface) // Set background color
            .text_color(color.text) // Set text color
            // Border properties
            .rounded_md() // Add 4px of border radius
            .border() // Add a 1px border
            .child(format!("Hello, {}!", self.user))
    }
}

impl Hello {
    fn new(cx: &mut WindowContext) -> View<Self> {
        cx.build_view(|_| Hello {
            user: "world".into(),
        })
    }
}

fn main() {
    gpui2::App::production(Arc::new(Assets)).run(|cx| {
        cx.open_window(WindowOptions::default(), |cx| Hello::new(cx));
    });
}

This transition has been intense. A bit like turning an aircraft carrier hard right at top speed. We think it will be worth it.

Dispatches from Marshall and Mikayla follow, but everybody is in pretty deep, so the rest of us will catch up next week!


Marshall

Keeping with the common theme, I spent this week helping push the GPUI2 rewrite forward. This has taken me all over the codebase, and it's been some of my first exposure to Zed outside of GPUI2 and our UI/storybook code. It's been awesome to see the first pixels being rendered using our new UI components in a real Zed workspace.

One specific area of focus for me was building out the new theme system with Nate. We now have constructs in place for building up a theme based on color scales, allowing for easily building visually-consistent themes. Ultimately this new theme system will make it possible to expose more customization for Zed without having to build an entire theme from scratch.

On another note, I'm very happy I had the chance to attend the Zed Summit last week. I had a blast getting to meet the rest of the team in person and hack on Zed together all week!


Mikayla

It has been a wild few weeks! We all had our summit last week, and we embarked on one of our most ambitious projects yet: rewriting the entire application in what we've been calling 'GPUI2' before we open source Zed. GPUI2 is a rewrite of our internal UI framework that lets us write composable UI code we can actually work with. We've got a new layout engine, a snazzy new executor, and a bunch of styling helpers based on the Tailwind CSS library.

For me, I've been focusing on learning the new systems and helping build out functionality that GPUI2 needs but hadn't yet been wired in. It's been a wild ride, as almost every day revealed some piece of GPUI's behavior that our application relied on and GPUI2 didn't yet provide. Rust's type system made this kind of refactoring possible, and we've all spent a lot of time renaming functions and swapping in new types.

We have a few large crates high up in our dependency graph:

  • project, which hooks together our core application code
  • workspace, which combines the project with all of our other UI
  • editor, which powers every single text input in Zed

Due to these bottlenecks, re-typing Zed has been far more synchronous than we'd like, which is bad news for a 10-person team that spends the majority of its time remote. Thankfully, we have a great tool for synchronous collaboration that we use every day: Zed! We had some incredible moments at the summit where 4 people would be sitting around a conference table, all working on one person's machine from their own laptops, fluidly separating and re-joining to work on different parts of the build graph as they became available.

Since the summit, this practice has continued, with 2 or 3 teams working in parallel wherever they can. Without Zed's synchronous programming capabilities and the guidance of Rust's type system, this would have been a much rougher journey.

Now, in our second week of rewriting, we've got between a third and a half of the app completed and are hitting the massively parallelizable part. Hopefully we'll be able to release a new preview of Zed soon!

Nathan again

Thanks for reading everyone, and thanks for bearing with us as we've moved our atoms in addition to our bits these last few weeks. Very soon we'll be getting to know each other a lot better. 🙇🏻‍♂️✌️