Episode 544

December 2nd, 2022 × #JavaScript #Performance #Testing

Supper Club × Bun with Jarred Sumner

Jarred Sumner discusses creating the Bun JavaScript runtime, focusing on performance, integrating tools, and potential use cases compared to Node and in embedded systems.

Topic 0 00:00

Transcript

Announcer

I sure hope you're hungry.

Announcer

Oh, I'm starving.

Announcer

Wash those hands, pull up a chair, and secure that feed bag, because it's time to listen to Scott Tolinski and Wes Bos attempt to use human language to converse with and pick the brains of other developers. I thought there was gonna be food, so buckle up and grab that old handle because this ride is going to get wild.

Announcer

This is the Syntax supper club.

Scott Tolinski

Welcome to Syntax Supper Club.

Scott Tolinski

Today, we have a very special guest. We have Jarred Sumner, the Bun creator.

Scott Tolinski

I was trying to think of some way to rhyme Sumner with bun and fun, creator, all these lovely things, supper club, whatever we got here. My name is Scott Tolinski. I'm a developer from Denver, Colorado. With me, as always, is Wes Bos.

Scott Tolinski

And today, we're talking JavaScript runtimes. We're talking Bun.

Scott Tolinski

And, Jarred, how's it going? Pretty good. This episode is sponsored by Tuple at tuple.app, and Tuple is the remote pair programming tool for developers who are tired of pairing over Zoom.

Scott Tolinski

This is just an incredible pair programming tool to share your code, with a lot of really, really desirable features. More on them later on in the episode.

Scott Tolinski

Awesome.

Scott Tolinski

So, Jarred, before we get into anything too intense, do you wanna give us maybe a little bit of background on who you are, what you do for a career, what you're doing now, and maybe how you got there? My name is Jarred. I'm the founder of Oven and the creator of Bun.

Guest 2

Bun is an all-in-one JavaScript runtime, with a built-in package manager, JavaScript transpiler, bundler,

Guest 3

and a really fast script runner. It takes a lot of existing tools, puts it all in one, and makes it really fast. Awesome. Yeah. We are honestly really excited about this because it probably hasn't even been 8 months, and it feels like half of our podcasts are talking about new JavaScript runtimes and the Edge and, like, what the next step of all of this JavaScript is. So it's really exciting to see Bun on the scene with this type of stuff. And hopefully, with this podcast, we're gonna dig into what it is and what it's for. Why did you make Bun? Like, is Node not good enough? What's the answer there? I was just really frustrated with how long everything takes

Topic 1 02:33

Bun is faster than Node

Guest 2

when you're building something, especially on the front-end side. Like, that iteration cycle time is just really slow.

Guest 2

It's to the point where, like, I was just checking Hacker News a lot of the time just waiting for things to build. And when something takes longer than, like, a few seconds, you sort of just immediately lose focus. You get distracted, and then you go do the thing that helps you when you're distracted, which is read Twitter, read Hacker News, whatever.

Guest 2

I mean, it makes you more distracted. Okay. So

Guest 3

Bun is setting out to replace, like, a couple things. Maybe let's go through that. I think probably the biggest one is it's replacing your JavaScript runtime. Right? So most people probably right now use Node.js.

Guest 3

So the idea is that you'll be able to run a server with Bun on it? Yeah. You'd be able to run a server with Bun on it.
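To make that concrete, here's a rough sketch of what a tiny server looks like when run directly with Bun, using its built-in Bun.serve API (the file name and port are just illustrative):

```ts
// server.ts — run with: bun server.ts
// A minimal HTTP server using Bun's built-in Bun.serve. The handler takes a
// standard Request and returns a standard Response (web APIs rather than Node's http).
const server = Bun.serve({
  port: 3000,
  fetch(req: Request) {
    const { pathname } = new URL(req.url);
    return new Response(`Hello from Bun at ${pathname}`);
  },
});

console.log(`Listening on http://localhost:${server.port}`);
```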

Guest 2

Indeed. Bun also exposes a built-in transpiler API, so you can use Bun's transpiler.

Guest 2

But actually, on every single file Bun runs in the runtime, it runs the transpiler too. And that's how it's able to do built-in TypeScript support for transpilation, just removing the types, and also things like the built-in plug-in system.

Guest 2

The API for the plug-in system is very similar to esbuild's. I almost just directly copied the API, and that lets you override things, like having your own transpiler plug-ins.
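As a rough sketch of that esbuild-style plugin shape (the loader name, file, and exact return fields here are illustrative and may differ between Bun versions):

```ts
// yaml-plugin.ts — hypothetical loader plugin following the esbuild-style API shape.
import { plugin } from "bun";

plugin({
  name: "yaml-loader",
  setup(build) {
    // Intercept imports of .yaml files and hand back JavaScript source instead.
    build.onLoad({ filter: /\.yaml$/ }, async (args) => {
      const text = await Bun.file(args.path).text();
      // A real plugin would parse the YAML; here we just wrap the raw text.
      return {
        contents: `export default ${JSON.stringify({ raw: text })};`,
        loader: "js",
      };
    });
  },
});
```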

Guest 2

And then you can also have macros too, which have a small amount of access to the AST, and it's all done the fastest way possible using direct integrations with the engine. The transpiler and all of that is written in native code, but it has very fast JavaScript bindings. Okay. And is that using V8 under the hood that you are binding to? Bun uses JavaScriptCore, which is the JavaScript engine used by Safari/WebKit. Okay. Mhmm. Yeah. Interesting. This was something where I experimented with a bunch of runtimes before choosing JavaScriptCore, and on the benchmarks I ran, JavaScriptCore was consistently the fastest to start up, while also having a really fast JIT.

Topic 2 04:10

Bun has a plugin system like Esbuild

Guest 2

And their team is just really good at optimizing the engine. Not that V8 isn't; V8 is obviously great too. I often tweet about the various performance improvements that the WebKit/Safari team does, and it's really cool. Like, a recent one made string replace, I think, 2 times faster. It might have been more than that, in a bunch of different cases.

Guest 2

And it was this really crazy optimization where, basically, it will actually look at the constant string you're replacing, and it'll do, like, constant propagation. It helped the different — I forget what they're called.

Topic 3 05:25

Safari team optimizes JavaScriptCore engine

Guest 2

The JS nodes, basically, how they're represented. And the result is that it's basically like memoizing it, sort of,

Scott Tolinski

but automatically. Yeah. Without the user having to do anything. And it's gotta be great. I mean, the average user is not going to notice the difference. Right? They're gonna fire up and just be like, alright, Safari performs better, or any browser that's using it, or, you know, I think Tauri is, like, the application framework for this type of thing. Nobody's gonna necessarily know what's happening, but the web is getting faster and developers get that for free. So, yeah, that stuff's really great to hear. I actually had no idea that Bun wasn't using V8. I just figured it was for some reason, so that's interesting. That's actually really interesting because,

Guest 3

I forgot who it was we were talking to. The folks from Igalia, because we talked about WebKit a while back, and we were like, yeah, does anybody other than Apple use it? And some of the folks from Igalia came out and said, yeah, we work on it, we put this thing in the PlayStation. Like, it's a pretty big thing, and it needs to go everywhere. So I was kinda excited to see that. Do you know, do people outside of Apple work on JavaScriptCore as well, or is that more of a WebKit thing where you sometimes get outside contributions?

Guest 2

They have some outside contributors.

Guest 2

And also, some of the people who work on WebKit now at Apple originally were just outside contributors.

Guest 2

Oh, that makes sense.

Guest 2

And PlayStation, in particular, is a big outside contributor. You can see on GitHub there are Windows builds of WebKit that are done by the PlayStation team.

Guest 3

Oh, interesting.

Guest 2

And in general, I think there's also, like, a refrigerator company.

Guest 2

I don't quite remember. Also, a lot of the ReadableStream and WritableStream stuff, like, the Web Streams APIs, have in the copyright, like, a camera company.

Guest 3

Really?

Guest 2

This is really random. I don't actually know the story there. Because WebKit is effectively like a monorepo.

Guest 2

And so, like, Bun can directly use code from WebKit for various web APIs. One example of that is ReadableStream and WritableStream. Most of the code is just directly copy-pasted from WebKit. Same for, like, URL, and a lot of the DOM event stuff, like EventTarget. It makes it much easier for Bun to be Web API compatible because, yeah, we don't even have to implement the actual Web APIs. We can just use them directly from a browser. Oh, yeah. I think that tracks for me because, like, when I think about

Scott Tolinski

projects like Bun, I think, how in the world can there be all of this feature completeness with a tool like this that's just getting started from scratch essentially now? But it's pretty amazing that you have additional help. Kind of an odd question here, but how did you come up with the Bun name?

Topic 4 08:51

Bun name comes from creator's friend's bunny

Guest 2

The whole Bun and Oven thing? That's it. So I didn't come up with any of the names. Bun was originally named after a friend of mine's bunny, who she named Bun. And,

Scott Tolinski

Great name.

Guest 2

And at first, I was like, well, I don't really wanna name this after your bunny. But then I thought about it more, and it makes sense, also because I think cute names and logos are good.

Guest 2

But yeah. In Bun's case, it's like, because Bun is both a bundler and also a bundling of many tools in the JavaScript ecosystem.

Guest 2

So it's, like, good in both ways. And then I thought Oven just sounded good.

Scott Tolinski

Our lead investor suggested it. Yeah. Especially because you already have the bun thing going on. Yeah. I never put the bun and bundler or any of that together, but that also tracks. Oh, yeah. Yeah.

Guest 3

That's awesome. So, I wanna talk about Node compatibility and, I guess, Web API compatibility, because we've talked about this a little bit on the podcast. There's this sort of new wave of JavaScript. Right? And part of it is, like, let's not just write Node apps. Let's write towards the fetch API and the Streams API and stuff that's just web standards. And then the other thing is, like, well, we kinda spent the last 10 years writing stuff with node modules, and it'd be nice to support that. So, like, where are we going with that? Does Bun support Node stuff, or are we trying to focus on the WinterCG stuff? Bun is trying really hard to focus on

Guest 2

being Node compatible while also supporting web stuff really well. Basically, I think people shouldn't have to rewrite their code.

Guest 2

And I think the bar for having to rewrite code is: it has to be so much better before it even remotely makes sense. And even then, it almost never makes sense. So Bun really intends to be a drop-in replacement for Node, but I think the future is closer to what looks like web APIs, where most JavaScript should just run the same, or it should run successfully on many runtimes and many environments.

Guest 2

But there's just this gigantic ecosystem. Like, I think it's the biggest in the world. I've heard that a few times, at least by package count. And I think it'd just be crazy to not support it. Yeah. It also seems pretty instrumental for its success to be able to,

Scott Tolinski

you know, be a drop-in replacement in that regard. Like, how many people are going to be able to start fresh or rewrite major swaths of things to support something?

Guest 3

So, yeah, that's interesting. So, obviously, Bun's a JavaScript runtime. It's a transpiler, so you can take your TypeScript and output regular JavaScript. It's a task runner.

Guest 2

Are you going to be touching any of the linting and formatting space that's sort of ESLint and Prettier right now? We have no plans for it right now. Okay. I think this is honestly a personal bias, and that is I tend to not use linters very much.

Guest 2

So I have, like, fewer opinions about them.

Topic 5 12:12

No immediate plans for linting in Bun

Guest 2

I use Prettier a lot. I do think Prettier is great. I think it was, like, my second job.

Guest 2

Half the PR comments were always, like, formatting issues and typos, and all of that is just a colossal waste of engineering time. Yeah. Yeah.

Guest 2

And, like, that is a perfect example of where tooling has a big impact. I just don't feel like formatting time is enough of a big issue. Like, Prettier definitely has a performance issue with larger files, but I don't feel like it's enough of an issue relative to transpilers or bundlers or the runtime or any of the other things. Yep. I can see a stronger case for a linter. But before it makes sense for Bun to do something in that area, I think we need much better editor integration than what we have right now.

Guest 2

That needs to be part of it, where the editor is integrated with the runtime. For example, I think it would be really cool to do automatic type generation with the plug-in system. I don't know exactly the specifics of that. But that's awesome. And so with your Node support,

Guest 3

do we still use npm, or do we just require something and you npm install it? Like, where does — is npm

Guest 2

even a thing in here, or does it just reach out to the registry on its own? Actually, the version of Bun we're about to release — I think it's most likely gonna be today, it might be tomorrow — does automatic npm install, but not using npm. It uses Bun's package manager.

Guest 2

And it just works: if you import Lodash, if you import React or whatever and there's no node_modules folder, then, yep, it'll automatically install it and use a shared global cache.

Guest 2

And this shared global cache means that instead of installing node modules repeatedly for every project, you only install them once. So you save a lot of disk space. You save a lot of time spent installing.
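A quick illustration of that flow (the file and package are just examples): with no node_modules folder and no install step, a script like this should still run.

```ts
// script.ts — run with: bun script.ts
// There's no node_modules folder here; Bun resolves lodash from its shared
// global cache, downloading it on first use as described above.
import { shuffle } from "lodash";

console.log(shuffle([1, 2, 3, 4, 5]));
```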

Guest 2

But then if you do need to debug your dependencies because there's some issue or whatnot, you can still use node_modules. So Bun has full support for the regular node_modules folder, because before it was a runtime, it really started out as a front-end dev server, node module resolver, and transpiler. So all of that is just baked in. So, yeah, you can use node_modules, and it's designed to work out of the box with that, so you don't need to change any of the import specifiers or anything like that. Okay. Yeah. What about, like, CommonJS support? Is it just JavaScript modules, or is it both? Cool. And on the CommonJS side in particular, the way that works internally is, if you have CommonJS, it actually transforms it into ESM, and that just runs automatically.

Guest 2

But then it actually has a special thing where, internally, the CommonJS becomes synchronous ESM, which isn't part of the spec, but that's a good way to handle it. Basically, the asynchronousness of ESM, when it's unnecessary, ends up causing applications to load a little bit slower because you have all these extra microtask ticks. Mhmm. Yeah. This way kind of lets you use ESM internally as an implementation detail that's not visible to the user, but it avoids that overhead of all the microtask ticks. Wow.
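As a loose illustration of the idea (this is not Bun's literal output; the transform is internal and evaluated synchronously):

```js
// A CommonJS module like this...
//
//   // math.cjs
//   const add = (a, b) => a + b;
//   module.exports = { add };
//
// ...is treated internally as if it were module-shaped code exporting the same
// bindings, but without the extra microtask ticks of spec ESM, roughly like:

// math.mjs
const add = (a, b) => a + b;
export { add };
export default { add };
```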

Guest 3

I'm just looking at the docs here, and I noticed you have support for HTMLRewriter, which is something from Cloudflare Workers, which allows you to basically intercept a request, fiddle with it, and send it along the way.

Topic 6 15:45

Bun optimized WebStreams API

Guest 3

So did you have to, like, reimplement that yourself, or is that something that Cloudflare makes available?

Guest 2

So they have their lol-html parser.

Guest 2

That's open source from Cloudflare.

Guest 2

And so Okay. And that's the same one that Cloudflare Workers uses.

Guest 2

So I just implemented the same bindings, matching their API, and copied a bunch of their tests, and it's exposed as a global.

Guest 2

It was kinda random, honestly.
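For readers who haven't used it, here's a small sketch of the HTMLRewriter API, which in Bun is available as a global matching the Cloudflare Workers shape (the response body here is made up):

```ts
// Rewrite an HTML response before passing it along, in the spirit of the
// "intercept, fiddle with it, and send it along" use case described above.
const upstream = new Response(
  '<html><body><a href="https://example.com">Docs</a></body></html>',
  { headers: { "Content-Type": "text/html" } },
);

const rewritten = new HTMLRewriter()
  .on("a", {
    element(el) {
      // Make every link open in a new tab.
      el.setAttribute("target", "_blank");
    },
  })
  .transform(upstream);

console.log(await rewritten.text());
```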

Guest 3

It's a cool API, honestly. Like, I've used it a couple times. I have a couple software-as-a-service products, and they've removed features.

Guest 3

And I was like, well, screw you guys. I have a domain name. I basically just wrote a little bit of code in between the software service and myself and added those features back in. It's a beautiful API. Is that where you see Bun being used, in, like, an edge location? What are your thoughts on serverless, edge, all that stuff? I think Bun is gonna be really good for

Guest 2

applications that do lots of server-side rendering, and eventually edge as well. But I think right now, server-side rendering and APIs are gonna be more of the focus, and also CLI apps. I think those two categories right now. On the server-side rendering side of things, for server-side rendering React, it's more than 3 times faster than Node. Yeah. It's, like, 4 to 5 times.

Scott Tolinski

And so that's a big number. Yeah. Where does that speed come from? You know?

Guest 2

It's a few things. I spent, like, basically a whole month on making React fast in Bun, and also, in general, making streaming, Web Streams, fast in Bun. The 3 things basically are: Bun has a slightly faster JSX transform.

Guest 2

That JSX transform improves rendering performance for React by around 30%, and that's just built in when you have it in production mode. But it also has a really fast HTTP server. It's to the point where, when people benchmark Bun using Node.js-based HTTP clients, the bottleneck is the benchmarking tool and not Bun, in most cases, if the throughput is high enough. Yeah. And the third thing is that Bun made a bunch of optimizations for the actual Web Streams implementation. Most of it is around moving all the code for queuing data into native code, and there's a lot of overhead in the default Web Streams API that Bun had to, like,

Guest 3

Work with. So a JSX transform, did you just write that yourself

Guest 2

to convert it? Yeah. The JSX transform, essentially, all it does is inline what React itself does. Like, in the newer JSX transform, when you call that jsx function, yeah, it just returns an object in a specific format.

Topic 7 18:41

Bun has custom JSX transform

Guest 2

But it turns out most of that can be done by the transpiler, if you just return object literals.

Guest 2

And then in the case where it can't do that, it has, like, a separate function that it runs.

Guest 2

But, yeah, that was a pretty good performance improvement.
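A loose sketch of what that inlining means (illustrative only, not Bun's exact output):

```js
// JSX input:
//   const el = <div className="greeting">Hello</div>;
//
// With React's automatic runtime, a transpiler normally emits a call like:
//   import { jsx as _jsx } from "react/jsx-runtime";
//   const el = _jsx("div", { className: "greeting", children: "Hello" });
//
// Inlining what that call returns means emitting the element object directly,
// skipping the function call at runtime:
const el = {
  $$typeof: Symbol.for("react.element"),
  type: "div",
  key: null,
  ref: null,
  props: { className: "greeting", children: "Hello" },
};
```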

Guest 3

Oh, and the TypeScript stuff, so Bun just strips types, right? Is that what you do? It just — you do? Okay.

Guest 3

Seems like nobody except for Microsoft has created an actual, like, TypeScript type checker.

Guest 3

Do you foresee any future where somebody will be able to write that? I understand it's a massive

Guest 2

It's a massive project.

Guest 2

Yeah. I think the version that somebody is likely to do is not exactly a TypeScript type checker, but something I think is interesting is the new Type Annotations proposal. Right? It used to be called Types as Comments. TypeScript comments. Yeah.
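For context, the idea behind that proposal is that an engine would parse annotations like these and simply ignore them at runtime, much like comments (a small sketch):

```ts
// Under the Type Annotations proposal, this would be valid JavaScript:
// the engine parses the annotations but attaches no runtime meaning to them.
function add(a: number, b: number): number {
  return a + b;
}

const total: number = add(1, 2);
console.log(total);
```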

Guest 2

And I think, in the long term there, maybe what that will open up is a TypeScript alternative where you could have maybe 80% or 90% compatibility with TypeScript itself, and then you have some differences otherwise.

Guest 2

But if you don't need to be 100% compatible, then you can change how it works a little bit and change some of the rules, and you could have something that's a lot faster and a lot simpler.

Guest 3

Uh-huh.

Guest 2

And I think that would be interesting, and I think that's also pretty plausible.

Guest 2

But, of course, the challenge there is, like, probably a lot of why people use TypeScript is the really good editor integration and editor tooling.

Guest 2

Yeah. So it's really not as much a language problem; even the type checker itself is not the hard part. It's making it so the developer experience with all the tooling is really good.

Guest 2

And that's just like a huge investment. You just need a bunch of people working on that.

Guest 3

Yeah. No kidding. I'm often just blown away. Even if you look at the VS Code release every single month, what they're putting out, it's like, how many freaking people are working on this thing? It's massive.

Topic 8 21:14

Recreating VS Code takes massive effort

Guest 3

That's a big thing.

Guest 2

They do such a good job. Yeah. Yeah. They do. This episode is sponsored by Tuple.

Scott Tolinski

Tuple is the screen sharing tool dedicated to frictionless remote pairing.

Scott Tolinski

We all know the things that are a huge pain when trying to pair over Zoom or Google Hangouts, like high-latency sharing, or really bad resolution, or just general clunkiness in the UI when I just wanna show you this code and have you look at it while we work through this together.

Scott Tolinski

Well, with Tuple, you get access to 5K screen sharing without destroying your CPU.

Scott Tolinski

It is extremely performant.

Scott Tolinski

Also, you get the ability to have very low latency screen sharing as well. So, without you noticing, you can get 5K screen sharing and low latency, so that way you can make sure that everyone's on the exact same page. So if you want to try this tool out, head on over to tuple.app/syntax.

Scott Tolinski

That's tuple.app/syntax, and give it a try today. This thing very well may solve all of your pair programming woes. Let's talk about Zig. What the heck is Zig? And how many times have you been asked that question since releasing Bun?

Guest 2

It's a lot. I think — I mean, okay. So Zig is a really fast programming language, really low level. You have to manage memory all yourself.

Guest 2

There's no garbage collector. There's no borrow checker. It's just up to you.

Guest 2

And this is really good when you need to understand everything that's going on in a program, because there's no hidden behavior.

Guest 2

So, for example, in C++, it's really common to have constructors and destructors that run automatically.

Guest 2

I think a JavaScript version of that would kind of be like if you had try/catch and then finally everywhere, and that finally could just be inserted basically anywhere.

Guest 2

So it makes it really hard to know what is actually being run when you call a function or create an object or whatnot, and Zig doesn't have that.

Guest 2

And that ends up being, like, a really good thing for performance because you know everything that is going on.

Guest 2

And, also, it sort of actively discourages allocating memory.

Guest 2

And what I mean by that specifically is, they make it so every time you allocate memory, it could potentially throw an error, and you have to handle that error, which makes it really annoying to allocate memory. So you sort of are encouraged to do static allocation, which basically means getting all the memory you need at once, and then not using very much of it.

Guest 2

So it sort of encourages you stylistically, and also through its other features, to write fast software. That's wild. And, like, how did you find out about this? How do you learn this type of thing? The first time I saw Zig was on Hacker News. Somebody had posted their docs, and I just read the entire thing, like, top down.

Guest 2

And that sounds like it's really complicated, but it's really not. It's not huge; because the language is really simple, it doesn't have a ton of syntax. You can just read it in one sitting.

Guest 2

Yeah. And then, also, it's all one page. I really like that style of docs, where it's all just one page.

Scott Tolinski

Yeah. Just search it. Command-F. Command-F to find anything. Yeah. Well, what's your history with languages? So, like, I would assume you have a large history of different languages. Is that your

Guest 2

your vibe? Yeah. Kinda. Well, not totally.

Guest 2

Like, medium.

Topic 9 25:15

Creator learned Ruby, Objective-C, JavaScript

Guest 2

My first language was Ruby.

Guest 2

The first thing I learned was, like, Ruby on Rails.

Guest 2

And then after that, I learned JavaScript a little bit. At that point, I didn't know JavaScript very well.

Guest 2

And then after that, I built a few iPhone apps, which was in Objective-C.

Guest 2

And then after that, I mostly worked on front end.

Guest 2

I spent more time as a front-end engineer than on, like, systems stuff. A little after that, I spent about a year building a game; it was, like, a multiplayer voxel game in browsers.

Guest 2

And that was really performance intense.

Guest 2

Like, we had to do the rendering. We had to do the multiplayer part. And with voxels, you have to fit a ton of data into, like, a very limited amount of memory, because in browsers, you really can't use more than a gig of memory before the tab just crashes. I spent probably, like, 3 weeks when I was working on the game just on, like, the object for storing — like, how we store the voxels.

Guest 2

And then, because it also needed to be sent over the network, it needed to be, like, synced between multiple players as they edit.

Scott Tolinski

It was really hard, so I learned a lot doing that. So is that where your understanding of, like, memory management comes from? Because it seems to me like getting into something like Zig is pretty intimidating for a developer like me who's only ever had languages like JavaScript, which are garbage collected; even Rust is fairly intimidating for me. Right? So, like, is that where you really got your hands dirty with that kind of more intense memory management?

Guest 2

I guess, why wasn't I more intimidated? I think, also, I did spend a little bit of time with Go. I would say Go was, like, a good warm-up. Go is still garbage collected, but in Go, you still sort of think about pointers. You still think about bytes more directly.

Guest 2

You can do, like, memory-unsafe things in a way that you generally can't in JavaScript.

Guest 2

But I would say that, for the most part with programming stuff, the way I think about it is it's all just code.

Guest 2

And, like, somewhere, when you do something, there's a function being called.

Guest 2

There's maybe some, like, assembly being generated at some point. And pretty much everything is just, like, sugar on top of that.

Guest 2

So, also, in Zig's case in particular, the hardest thing really is the memory management stuff. But for the most part, the language is really simple. Like, if you could ignore that, it probably has a similar amount of syntax as, like, JS. Interesting.

Guest 3

Wow.

Guest 3

This is maybe a crazy question to ask, but it's something that I'm curious about. Have you ever thought about putting Bun on hardware? I know there's Espruino, which is like a JavaScript interpreter that runs in low memory and things like that, but you've never really been able to run Node on small hardware other than, like, a Raspberry Pi. Is that something that could maybe happen at some point? The very literal answer is, like, yes. It is possible.

Guest 2

And, yeah, WebKit even has a port, WPE, specifically for embedded devices.

Guest 2

Really? So, like, they have a build of WebKit designed for this that Bun could, in theory, use.

Guest 2

And, also, Zig is very good for embedded programming environments, a lot because of this philosophy of statically allocating memory and the low-level memory management that's, like, perfect for embedded.

Topic 10 29:18

Bun architecture works for embedded systems

Guest 2

And a lot of Bun's code itself kind of looks like what you would do in an embedded environment.

Guest 2

But I think it's, like, a whole different focus, and something that we're just not focused on. Yeah. Yeah. Totally.

Guest 3

Oh, man. I'm looking at this WPE. This is supported by Igalia, and they are showing photos of a fridge, and I think this is a BMW console. Makes sense. Photos of the fridge. I remember back in the day that a lot of, like, infotainment systems were running on, like, QNX or something like that. It was like a BlackBerry thing. I'd like to do a show on that. Anyways, that's not the show, but, back to the Node stuff: is there a package.json file in Bun projects? I guess there can be. What's your thoughts on that? I think this is maybe, like, slightly controversial, but,

Topic 11 30:14

Package.json is like an import map

Guest 2

I think package.json is sort of just, like, a better import map than the import maps that you see in browsers, in a lot of ways.

Guest 2

And that's kind of how Bun sees it. So Bun very much has built-in support for package.json. You can use the exports field.

Guest 2

You can use main, module, all that stuff.

Guest 2

If you wanna write a library and target Bun specifically, you can use the bun export condition, and Bun will pick that up. Yeah. So, like, Bun has great support for that. And with the new changes to the module resolution in the next version of Bun, where it automatically installs npm packages, it still supports package.json. So the idea there is you would have your package.json as you did before. And if there just isn't a node_modules folder, then Bun's package manager will automatically read your dependencies, install them into the shared global cache, and use them. And it's intended to behave a lot like npm and pnpm and Yarn and all of those.
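A small sketch of what targeting Bun with an export condition might look like in a library's package.json (the name and file paths here are made up):

```json
{
  "name": "my-lib",
  "version": "1.0.0",
  "exports": {
    ".": {
      "bun": "./src/index.ts",
      "import": "./dist/index.mjs",
      "require": "./dist/index.cjs",
      "default": "./dist/index.cjs"
    }
  }
}
```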

Guest 2

And in fact, you can actually use bun install as a separate package manager, and that will install a node_modules folder similarly to the npm package manager. And that node_modules folder is completely designed to work with Node.

Guest 2

And there's, yeah, no relation actually to Bun itself. It's just that now Bun's runtime is starting to use it too. Why can't Node just get faster?

Guest 3

That's a — you know, like, people are obviously creating new JavaScript runtimes to sort of get around that.

Guest 2

Is there anything stopping it? Is it just too big of a project, too much cruft? I think it's really hard. I think things sort of need to be fast from the beginning, because otherwise, you basically need to rewrite everything.

Topic 12 31:45

Hard to make existing projects much faster

Guest 2

For Bun, a lot of the time we spend is on how do we implement this API in a way that enables it to be really fast, and designing APIs to be really fast. It's just really hard. Like, it's possible. It's just really hard. Yeah. Just

Scott Tolinski

that's so much momentum in one direction, so much code. What about — okay, what does the future of Bun look like, short term or long term? Like, where do you see Bun evolving to?

Guest 2

In the short term, our focus right now is getting to a 1.0 stable release.

Guest 2

And I think that's gonna be another 3 or 4 months.

Guest 2

And then we're gonna have hosting too as a built-in part of the product. And that hosting will work with existing frameworks like Next and Remix and a lot more. In the longer term, the goal is to become the default way people write JavaScript and run JavaScript.

Guest 2

The reason why is that I think we can just make it a lot faster and make it a much simpler developer experience, yeah, by putting everything into one tool and by making it really fast. Another thing I'm really excited about is our testing library. Yeah, we have a built-in testing library, and, last I checked, for small scripts, it's something like 40 times faster than Jest.

Guest 2

And if it's, like, TypeScript, and it's a lot of TypeScript, because the TypeScript transpiler is really fast in Bun, it ends up being something like 200 times faster, which is kind of a big number. I feel a little uncomfortable saying 200, but that's what it looked like when I tried it last.
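For a sense of the API surface, here's a minimal test file for Bun's built-in runner (run with: bun test; the file and function are just examples):

```ts
// math.test.ts — run with: bun test
// Bun's built-in test runner exposes a Jest-style API via the "bun:test" module.
import { describe, expect, test } from "bun:test";

const add = (a: number, b: number) => a + b;

describe("add", () => {
  test("adds two numbers", () => {
    expect(add(2, 3)).toBe(5);
  });
});
```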

Topic 13 33:23

Bun testing library 40x faster than Jest

Guest 2

And I think a lot of it comes down to: when you put everything into one tool, you can make the pieces work really, really well together. Yeah.

Guest 3

Wow. That's awesome.

Guest 2

So for Bun, like, transpiling TypeScript is basically free, because it always transpiles JavaScript anyway, and the actual TypeScript part doesn't really add much overhead at all. And you have, like, the whole

Guest 3

Jest library API available in Bun? So, like, if you have a large test suite, you'd be able to move that over?

Guest 2

We're not there yet. Okay. We don't have that many matchers implemented yet, but that is the plan.

Guest 2

We're just not
