tov_objorkin's comments | Hacker News

I was greatly inspired by his work. After building up enough skills, I even built my own IDE with live coding and time travel. Its practical use is questionable, and it seems like nobody is really interested in such tools.

Playground: https://anykey111.github.io

Images: https://github.com/anykey111/xehw


I've dabbled a lot in this space as well: I built an experimental language that natively supports live coding, after first building live-coding capabilities through LSP for love2d (Lua) to get a feel for the feature set I wanted, etc.

Love2D Demo https://github.com/jasonjmcghee/livelove

Language Demo https://gist.github.com/jasonjmcghee/09b274bf2211845c551d435...


Nice. The main problem is broken state. I use immutability at the language level to prevent disastrous code changes, so the program is literally unkillable during live coding, and you can jump back to saved checkpoints without restarts.


Yeah the language here has a notion of the "last good state" so it can keep running. In the demo I'm not hitting "save" - the moment there's a good state, it becomes the "current version" - but there's no reason it needs to be that way.

I made the decision that state management is manual - the "once" keyword. Any expression/block not using "once" is re-evaluated any time there's a change to the code. If it is using it, it only re-evaluates when you change the (depth 0) code of that once-wrapped expression.
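
The described `once` semantics can be modeled roughly like this (a hypothetical sketch, not the actual implementation; all names are made up): a block is keyed by its own source text and its body only runs when that text hasn't been seen before, while everything else re-runs on every edit.

```rust
use std::collections::HashMap;

// Hypothetical model of `once`: results are cached under the block's own
// source text, so a body re-runs only when its source text changes.
pub struct OnceCache {
    cached: HashMap<String, i64>,
    pub evals: u32, // counts how many times a body actually ran
}

impl OnceCache {
    pub fn new() -> Self {
        OnceCache { cached: HashMap::new(), evals: 0 }
    }

    // `src` stands in for the (depth 0) source text of the `once` block.
    pub fn once(&mut self, src: &str, body: impl FnOnce() -> i64) -> i64 {
        if !self.cached.contains_key(src) {
            self.evals += 1;
            let v = body();
            self.cached.insert(src.to_string(), v);
        }
        self.cached[src]
    }
}
```

Calling `once` twice with the same source text runs the body only the first time; changing the source text triggers a fresh evaluation.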


In my case, only part of the program is recompiled and re-evaluated; the rest lives in a "committed" frozen area. Users can try new changes and throw them away freely. The editor performs an evaluate/rollback cycle on every keystroke, ensuring no accumulated or unintended changes to the state are made during editing. When the user is satisfied and hits run, a long-term snapshot is created and the source code snippet moves to the frozen area. That's critical, because the edit rollback also restores file positions and streams.
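
The evaluate/rollback cycle can be sketched like this (a hedged illustration with made-up names, not the tool's actual code): every trial edit runs against a clone of the committed state, so rollback is just dropping the scratch copy, and hitting run promotes it into the frozen area.

```rust
use std::collections::HashMap;

// Hypothetical sketch of evaluate-then-rollback for live coding.
#[derive(Clone, PartialEq, Debug)]
pub struct State {
    pub vars: HashMap<String, i64>,
}

pub struct Session {
    pub committed: State, // the frozen, "committed" area
}

impl Session {
    pub fn new() -> Self {
        Session { committed: State { vars: HashMap::new() } }
    }

    // Trial-evaluate an edit against a throwaway snapshot; the committed
    // state is untouched, so "rollback" is just dropping the scratch copy.
    pub fn try_edit(&self, edit: impl FnOnce(&mut State)) -> State {
        let mut scratch = self.committed.clone();
        edit(&mut scratch);
        scratch
    }

    // "Run": promote the trial state into the frozen area as a snapshot.
    pub fn commit(&mut self, trial: State) {
        self.committed = trial;
    }
}
```

On every keystroke the editor would call `try_edit` and discard the result; only an explicit run calls `commit`.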


Me too, for my master thesis:

https://m.youtube.com/watch?v=HnZipJOan54&t=1249s

It was a language designed alongside its IDE (which was a fairly rudimentary web app).


Exciting stuff, thanks for sharing!


I've come around to feeling that if I'm going to make an experimental development tool, I need to make it in service of building something specific. Maybe something playful... if I'm building something "important" then it can put unwanted conservative pressure on the tool. But something; and if I do that, then at least I have something interesting regardless of the fate of the development tool. Because yeah, there's a good chance no one else is going to be excited about the tool, so I have to build for my own sense of excitement and be my own most enthusiastic user.


I share a similar sentiment.

I have a deep drive to build the "important" stuff so that my life has meaning, but there's something hard to motivate about any given thing being "important" when you look at it long enough. It seems like the "important" thing I'm building eventually looks ridiculous and I bounce off of it.


Maybe this is some kind of art that doesn't need to be useful.


I think your time might be now.

One major issue with vibe coding is parsing divergent code paths, where different prompts create different solutions and architectural compromises.

Parsing that mess is a major headache, but with live coding and time travel, I bet those tools would make managing divergent code branches easier and really take advantage of branching repositories with multiple agents all working in tandem.


This is excellent: thank you for pursuing these wonderful ideas.


I wish I had the skills to explain my work as well as Bret Victor does. Editing, reverting, and committing parts of a running program feels alien to users.


Isn't that part of Paul Graham's startup lore? They were running lisp web servers for their ecommerce store and while a customer was on the phone with an issue, they would patch the server live and ask the customer to reload. Customers would hang up convinced it was their personal glitch.


The tool uses a Forth-like language with immutable data structures and persistent memory snapshots. It also uses Clojure-style metadata and compile-time metaprogramming. I have had no luck convincing people that a language without curly brackets is useful.


There was recently an HN post with a video of someone using a pretty cool environment that supported that kind of live-coding for creating an electronic music track -- it seemed very appropriate there, and I would guess likely to be popular.


Bold plus, making PLs is a lifestyle, not a business. Most PLs clone each other and absorb features; the only real difference is QOL and tooling. Users expect to have a full set of batteries, an IDE/LSP, jobs, OOP style, and minimal effort to learn. Being popular contradicts the idea of pushing the boundaries and shifting paradigms.


> Bold plus, making PLs is a lifestyle, not a business.

Yeah, whenever I encounter a new language, I take a look at its GitHub commit history to see how serious they are. Usually it's all green every day; there's a sort of obsessive compulsion behind working on these projects.

It's pretty amazing how the boundary of what a PL actually is has expanded. It's really the story of "If you give a mouse a cookie"....

Used to be, back in the day, you didn't even have to implement the thing (ISWIM). But if you give people a programming language, they're going to expect a compiler for it. Then devs started expecting a whole standard library sometime after the 70s. By the 80s and 90s, IDEs were all the rage -- you needed to provide at least syntax highlighting for sure, and a breakpoint debugger was becoming a standard expectation.

In the 90s - 00s, open source rose to prominence, and communities of open source developers worked to create robust community-driven language ecosystems, which then became an expectation for new langs. Quite the paradox there -- how do you create a community around a new language if the new members expect a community??

But once you have a robust package ecosystem, devs start expecting ways to manage it. So now you not only need a package manager, but also a package repository and all the issues which come with that.

Now with all these packages you also need to provide a robust build system to download them all, build each one, link the binary, and it should be compatible with all major operating systems, all major architectures, and of course the web.

LSPs were the most recent "must have" until "AI integration" took over; now you need to have AI assistants that know your language and all the libraries.

All that before you even start talking about the language specifics. To be popular, your language must a) be severely limited in its "weirdness budget" (the degree to which you break from traditional languages must be a small delta, or potential users complain), b) be imperative-first, c) and most importantly, be open source and charge exactly $0 for all of this.

That's why the quickest way to build a new and different language is actually to create a cult around it. If you're gonna make any money at all, it'll be in selling plushies of your mascot. I wrote a whole novel about that route here a couple weeks ago. https://news.ycombinator.com/item?id=45806741


Forth is pretty low-level; I don't think it can compete with high-level languages. Postfix notation and stack juggling are just boring.


That's what I would call the "Forth challenge": to grow out of stack juggling.

When you look at the Forth code of a beginner, yes it's full of stack juggling, of "@" and of "!". When you look at code from more experienced Forth programmers, there's much less of it.

The challenge is to build your way out. There's no fixed way to do this, because the best path to do so is generally dependent on the task at hand.

Needless to say, most programmers fail this challenge.


You are definitely not making it sound attractive for real projects where the outcome matters.

Sounds like a fun game maybe.


This challenge comes with rewards. Forth has superpowers that can't be found elsewhere.

But it's definitely not for everyone. I'd say that the status quo of Forth being an obscure niche is fine, just fine. If you need convincing, if you aren't spontaneously curious about Forth, then it's likely not for you.


I'm in favor of everyone having fun however they want, and you have piqued my interest a bit that it might be fun.

But if even the proponents of the language say that ("needless to say"!) most programmers will be incapable of the challenge of writing good code in it... definitely doesn't sound like anything I'd want in anything I wanted to be maintainable or long-lasting. If this is the outcome you desire, then your, uh, anti-evangelism is working!


I've never written Forth as part of a team. I suppose it's difficult with an average team. But then I try to imagine what a small team of good Forth programmers could do and I'm thinking it would be a quite powerful team.

If I piqued your interest, might I interest you in the introductory series I wrote: https://tumbleforth.hardcoded.net/


Factoring is a good way to reduce complexity, but writing math is a painful experience. To be fair, an infix version of Forth exists as an extension library.


C is pretty low level… and yet.


Disagree; there are no types in Forth, only cells. The user acts as the compiler. Compared to C, imagine that every keyword like for/while/break is implemented as a macro using setjmp/longjmp. And that is the strong part of the language: flexibility, but without any guarantees.


One very stupid reason is Qt itself: a graphics application can't live without a QScreen instance. If the user unplugs every screen from the system, Qt creates a fake screen and hangs in a headless state in order to prevent a crash.


LLVM had an MSIL backend back in 2007 [1]; it was abandoned due to lack of interest.

1. https://discourse.llvm.org/t/msil-backend/8480


I have a backend for a HiSilicon SoC from 2016; it's about 1500 lines of C code, a few calls to a 2D API, and a copy-paste of a triangle rasterizer. Dunno about modern version requirements.


Oh, please, making a reasonably good DSL might take a month or even years. Also, wrestling with the parser and the borrow checker is not an easy task for the average user.


I'm no Rust zealot but the borrow checker is arguably one of the core benefits for the average user. Wrestling with it means they are not yet ready for systems programming and need to understand move semantics more deeply.
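
A minimal illustration of that move-semantics point (a toy example, not from the thread): passing an owned String by value moves it, so the borrow checker rejects any later use of the original binding; passing a borrow avoids the move entirely.

```rust
// Taking the String by value moves it: the caller's binding is gone after
// the call, and the borrow checker rejects any further use of it.
pub fn shout_owned(s: String) -> String {
    s.to_uppercase()
}

// Taking a borrow leaves ownership with the caller.
pub fn shout_borrowed(s: &str) -> String {
    s.to_uppercase()
}

pub fn demo() -> (String, String) {
    let name = String::from("ferris");
    let loud = shout_borrowed(&name); // borrow: `name` is still usable
    // let l2 = shout_owned(name);    // this move would make `name` unusable
    (loud, name)
}
```

The commented-out line is exactly the kind of "wrestling" beginners hit: uncommenting it while still returning `name` is a compile error, not a runtime bug.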


The borrow checker is fine. But from the library writer's perspective it's a pain and takes an enormous amount of time to make an API sound. One does not simply check out "nom" and test things in a few minutes.


We're social animals; this is a hardcoded part of our brain. Managing solitude requires training, like anything else.


Solitude and boredom is not the same.

I understand solitude. You have a need. It's not fulfilled.

I don't get boredom: I see so many ways to fulfill this need.


unwrap() or expect() unconditionally smashes program execution in case of error. A properly written program should handle such errors or use unwrap_or_* as a fallback. The point is that unwrap/unsafe breaks the safety rules; a user writing one either knows what they're doing or just isn't bothering.
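
A minimal contrast of the options being discussed (a generic sketch, not code from the thread): `unwrap` aborts on the error case, while `unwrap_or` and pattern matching handle it gracefully.

```rust
// Panics on bad input: the unwrap smashes execution if parsing fails.
pub fn parse_port_strict(s: &str) -> u16 {
    s.parse::<u16>().unwrap()
}

// Falls back to a default instead of crashing.
pub fn parse_port_lenient(s: &str) -> u16 {
    s.parse::<u16>().unwrap_or(8080)
}

// Explicitly handles both outcomes of the Result.
pub fn describe_port(s: &str) -> String {
    match s.parse::<u16>() {
        Ok(p) => format!("port {p}"),
        Err(_) => "invalid port".to_string(),
    }
}
```

All three compile; the difference is entirely in what happens on the error path, which is the deliberate choice being debated here.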


Unwrap and expect don’t violate any safety rules! If they did you’d need to wrap it in an unsafe block.

https://doc.rust-lang.org/nomicon/unwinding.html


You're technically right; I mean that if a user aims for durable code, then using unwrap is not the desired behavior most of the time. For example, pointer dereference in C is an implicit operation; in Rust, the user must choose explicitly how to handle Option/Result. Even if using unwrap is not as harmful as using unsafe and doesn't violate the rules, it is our deliberate choice to break "end-user safety".


Once I had a dream about an RN application for iOS/Android without Xcode and Android Studio.

I started with Expo, a great tool if you don't have native modules. But I did have one: a very complex search library in Rust. After a few attempts, the code was converted to asm.js (RN doesn't support wasm). So I had a JS-only codebase that worked on Expo: live coding (a bit slow because of the large JS codebase) and easy deployment on all platforms.

After release, users started complaining about performance. The RN fetch API doesn't support binaries; everything is passed back and forth as base64. Some low- to mid-tier phones got stuck for a few minutes. RN fetch also doesn't support progress, among a lot of other things.

The decision was to start using native libraries. Ehh, goodbye sweet Expo...

The lesson was learned: if you have a complex application, don't expect too much without heavy investment. JS is ugly and unpredictable as usual. The overall experience was ok/meh, but not great.

For the next project I'm giving Qt5 a chance.


You may want to also look at MS/Xamarin development; I know a few people that really like it. Though RN seems a bit more popular. For that matter, in what ways is JS unpredictable? Generally, combined with Redux or something similar, I've found it to be fairly consistent and predictable.


About five years ago I had a dream that I could abandon all this peasant C/C++ development and jump on the Mono/Xamarin train. Our main product was a Linux ARM/MIPS karaoke machine with optional remote control via desktop/mobile apps. The idea was to use the same code everywhere.

- Tooling for Linux was horrible, like really!
- Alien mobile UI look and feel.
- One cannot simply build a mobile app for desktop.
- The shiny new search engine prototype didn't fit into low-tier phone RAM.
- A Mono "Hello world" application on a MIPS device with 128 MB of RAM uses about 28 MB.
- A 400 MHz CPU has a really hard time with GC; a no-go for soft realtime.
- No seamless integration with native code; using libs like ffmpeg is crazy hard.
- F# was a second-class citizen at that time!

Now things have changed, but still, IMHO, Linux is not a good choice for .NET, and MS forces you to use Azure/Cloud/etc. infrastructure.


I was thinking in the context of mobile... TBH, depending on my needs, I'd probably just fall back to an Electron app for desktop at this point. Not good for lighter systems though.

.Net doesn't force Azure/Cloud at all. I know all the MS demos tend in that direction, but you can absolutely use it all internally. I've worked on several backends targeting docker and linux with .Net core at this point. Even ARM as a target is decent. GUI, no idea on that, I tend to prefer react+redux+material-ui there.


About JS: I made the wrong assumption that we had entered the v8 era and my number-crunching code would work reasonably fast everywhere.


Gotcha... depending on the size/scale of the numbers, Python is probably the only general-purpose scripting language I would trust for that (though I don't actually use Python; it's not what I work on).

