There are still people explaining to me why you need a 'system programming language' like C or Rust for certain tasks. Meanwhile, I'm playing a game of Quake on Mezzano, where everything from drivers to the actual videogame is written in pure Common Lisp.
In case you wonder how froggey managed to obtain a Common Lisp version of Quake, have a look at https://github.com/froggey/Iota.
That version of Quake is written in C and ran through a C->LLVM->CL transpiler. This is "written in pure Common Lisp" in the same sense that Emscripten applications are "written in pure Javascript".
Naughty Dog games (up to Uncharted) used to be written in a Lisp without garbage collection. The same guy then wrote the ITA flight search system. I think he may be the only productive Lisp programmer in the world.
Back in the 2000s, someone wrote a game debugging tool in Squeak Smalltalk which ran concurrently with the game (written in C++) it was debugging. It "cheated" on garbage collection by shutting the GC off entirely: at the end of each game-loop tick, it simply disposed of every object that hadn't been manually marked "to keep."
That sounds like the best way to do GC in a game. Another way is to use explicit memory pools, remember to allocate from the right pool, and throw it all out at once.
General GC is really overrated I think. Automatic retain counting is nicer because it's deterministic and you never have to scan memory. Compaction can be nice but you can live without it.
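The per-tick "mark to keep, drop the rest" scheme described above can be sketched in a few lines of Common Lisp. All names here are invented for illustration; in a garbage-collected runtime, "disposing" simply means dropping the last references so the objects become unreachable.

```lisp
;; Objects allocated during the current tick.
(defvar *frame-objects* '())
;; Objects explicitly marked to survive past the tick.
(defvar *kept-objects* '())

(defun frame-alloc (object)
  "Register OBJECT as temporary for this tick and return it."
  (push object *frame-objects*)
  object)

(defun keep (object)
  "Mark OBJECT to survive past the end of the tick."
  (push object *kept-objects*)
  object)

(defun end-of-tick ()
  "Drop every reference to unmarked frame objects;
the GC (or nothing at all) reclaims them later."
  (setf *frame-objects* '()))
```

The memory-pool variant works the same way, except the pool owns raw storage instead of a list of references, so "throwing it all out" is a single pointer reset.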
I consult for a company that uses Common Lisp to design state-of-the-art network router chips. An e-commerce system that I wrote in CL powers my web site.
That depends on your definition of "written in." The transpiled Lisp code isn't at all what a human would or probably even could write. Don't forget about all those optimizations that get directly translated.
It's not unlike the disassemblies of the '90s. They compile. They work. But no human writes the kind of code that results from disassembling an optimizing compiler's output.
"Someone" means a human, and that's where you are probably quite wrong.
The same point would then be that Overwatch or Assassin's Creed could be written entirely in Assembler "by someone", because a disassembler can output asm code that compiles to the exact binaries needed to reproduce it.
Yes, it can just like every other Turing complete language. Even Qbasic can do it this way. Let's get back to what a human "someone" can program in the language instead of these high level computer generated abstractions through Turing complete languages.
Waterloo Basic with all its crazy line numbers could do it entirely using PEEKs and POKEs with the logic you are clinging to. And it'll be just as unreadable as the interpreted LLVM is. And exactly as fast.
LLVM can be interpreted in Brainfuck. How far down the rabbit hole can we go before you interpret the whole thing in 1's and 0's? Crazy thought!
Try the conversion yourself and look at the results. It isn't anything like what a human could be expected to read and write. I use it often enough, and I can assure you I'd never want to be the chump who hopes to refactor it into realistic / maintainable Lisp code. What makes this useful at all is the already long-standing CLOS.
It serves its purpose. But you are over-glorifying the merits of the fact that Lisp is Turing complete like so many other languages that came after it.
Other people have already pointed out a speed penalty that I won't cover here. It works, but it's not the OG... with enough time and effort, it could be. But for now, it isn't even close.
As a datapoint, Quake is quite old. When it was released it used software rendering on a Pentium 2. While it is nice that everything from the ground up is written in Lisp, it sounds as if you are only achieving the performance of a P2 on modern hardware. What is the performance gap between Mezzano and an OS/application written in C/C++?
Folks are missing the point with performance. The Mezzano compiler plus translating LLVM into Lisp won't be very efficient. That's not the point. It's crazy that a game like that runs on a non-Windows, non-UNIX machine written from the ground up in Common Lisp, by way of compiling an application into a bunch of Lisp source code first.
I built it (circa 1996) on HP PA-RISC workstations, like HP-715's. There was no assembly code for those, just portable C. The frame rate wasn't great, but it played.
Both Quake 1 and Quake 2 ran fine on my Pentium MMX (and I guess a regular Pentium, since they didn't use the MMX stuff) using the software rasterizer, and Quake 3 also ran fine on my Pentium 2 with a GeForce 2.
The reason it is slow is not that it is written in Lisp but how it is implemented, primarily because it is a hobby project. For example, graphics are rendered on the CPU.
I don't think it runs Linux; there are tons of embedded OSes.
It could be Nucleus, RIOT OS, or, most likely, some proprietary OS shipped with the specific MCU they are using.
OS writing is pretty simple, as long as you do not need a ton of device drivers and networking. There are over two dozen of them listed there: https://en.wikibooks.org/wiki/Embedded_Systems/Common_RTOS and that is not a complete list.
My Samsung washing machine runs on (I believe) an mc6800 variant (not mc68000). It's admittedly not a surf-the-net/watch-TV-while-you-wash model, so no Quake, but on the upside, no viruses.
Curious, is this your main machine or a hobby machine? Regardless, I remember reading about this years ago (linked elsewhere for previous discussions on HN). Has it advanced to being more usable since? Interested to try it.
"Running on real hardware:
The hybrid image can be burned to a CD or dd'd onto a USB drive, and booted directly.
It requires a 64-bit x86 machine, 2GB of RAM, and a PS/2 keyboard and mouse."
I wonder if they can use this approach to convert and generate tons of tools for this OS: build a POSIX layer, then convert loads of useful Unix tools via LLVM.
LISP has been a system programming language. Still is.
People just don't think about it much and forget that, for a time, LISP was the shit in computer science; we invented half of the modern desktop on LISP machines!
I was under the impression that Smalltalk contributed more to what we think of as “the modern desktop” than LISP. If anyone has the real story, I’d love to hear it.
I don't have the real story, but there was a lot of cross-pollination between smalltalk and lisp. It can be challenging to determine which things originated where sometimes.
IMO a lot of it is just that all of the ingredients for a modern desktop were there for anyone with 10s of thousands of dollars of hardware, so there was both cross-pollination as well as parallel evolution of ideas.
The invention of the sewing machine is kind of like this: there was a point at which we were very close to being able to make a sewing machine, and several people each solved a subset of the problems. The union of those subsets gave us a sewing machine, so singling out a single inventor is somewhat nonsensical.
Most Lisp machines were specifically targeted at developers, while Smalltalk at Xerox was more targeted at office productivity. Interestingly enough, Xerox hired the BBN team that made Lisp machines, and the Xerox Star was made to run Lisp as well.
> Most lisp machines were specifically targeted at developers
A bunch of Lisp Machines were directed at production, not developers. From Symbolics there were special models for that and also smaller "delivery system" versions of the operating system.
For example, American Express had a bunch of Lisp Machines checking complex credit card transactions, Lucent had Lisp Machine nodes in a network switch, NASA had Lisp Machines monitoring video camera streams of rockets from Space Shuttle launches, Symbolics sold many graphics systems to TV studios, etc.
> while smalltalk at Xerox was more targeted at office productivity.
Is that true? Xerox had an office system running on the same hardware, but written in Mesa - unrelated to Smalltalk. Xerox had the CIA as a main customer for the Smalltalk machines, using an application called 'Analyst'...
'But our big commercial involvement was with the CIA.'
> Interestingly enough Xerox hired the BBN team that made lisp machines, and the Xerox star was made to run lisp as well.
BBN made a Lisp in the 60s, from where some developers went to Xerox in 1972/73. Before Lisp Machines existed. BBN Lisp was renamed Interlisp and was supported by both companies. Xerox developed Interlisp-D for Lisp Machines and BBN put Interlisp on an internal research computer.
>I don't have the real story, but there was a lot of cross-pollination between smalltalk and lisp.
Smalltalk was influenced by Lisp, and in turn, Smalltalk influenced the subsequent Lisp systems; that's where OOP support appeared in Lisp, first with LOOPS and MIT Flavors, ultimately with CLOS in ANSI Common Lisp.
Common Lisp development is also image-based, just like Smalltalk's. This is a direct Smalltalk influence.
And still I haven't seen any language more object-oriented than Smalltalk. Sure, Python, C++, Java, etc. support OO to some degree. But in Smalltalk literally everything is an object, which means you can achieve the same functionality as Lisp macros in Smalltalk as well. In Lisp you have data = code; in Smalltalk you have data = program (which is running). In fact, everything is always running in Smalltalk. I'm more than eager to learn more and more about Smalltalk. It's one of those languages with a valuable philosophy behind its design (just like Lisp and Haskell).
It's more useful to think of Smalltalk as message-based than object-based. Messages are underrated, but objects are just function calls where the first parameter is on the left.
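For what it's worth, CLOS makes that duality explicit: the "receiver" is just the first argument of a generic function. A small sketch (the class and method names here are invented):

```lisp
;; Smalltalk:  aCircle area        "message sent to a receiver"
;; CLOS:       (area a-circle)     ;; generic function, receiver first

(defclass circle ()
  ((radius :initarg :radius :reader radius)))

(defgeneric area (shape)
  (:documentation "Area of SHAPE, dispatched on its class."))

(defmethod area ((c circle))
  (* pi (radius c) (radius c)))

;; (area (make-instance 'circle :radius 2)) evaluates to roughly 12.566
```

The difference that remains is semantic: a Smalltalk message can be intercepted by the receiver (doesNotUnderstand:), whereas a generic function dispatches over all its arguments, not just the first.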
> in SmallTalk you have data = program (which is running)
Isn't it just an eagerly-evaluated language with closures? Can it walk its own AST?
If you haven't yet, I'd recommend reading the "Blue Book" (Smalltalk-80: The Language and its Implementation) by Goldberg. It has a very holistic approach and, as such, is one of the most accessible and pleasant books describing a computing/programing environment one can find.
I also recommend reading http://www.mirandabanda.org/bluebook/bluebook_imp_toc.html once you're done with the Blue Book; the VM is a two-weekend project in any compiled high-level language (I used Rust) and it's not hard to make it performant enough for real use. I think my timeline was ~2 days for the initial implementation, 2 days for fixing up small bugs (mostly in primitives), and then 2-3 hours of minor tweaks to get the VM performant enough to be comfortable.
BTW, somebody typed the VM source in from the blue book and posted it online. You may find the resources on that page useful: http://www.wolczko.com/st80/
I was under the impression that the pioneering Xerox systems were written in Mesa, which was not even object oriented. I'd love to hear the real story as well.
When you think about it, Lisp is perfect for their use cases: pushing patches live without rebooting, and easy modification of a running system image.
'The objective of EMPRESS (expert mission planning and replanning scheduling system) is to support the planning and scheduling required to prepare science and application payloads for flight aboard the US Space Shuttle. EMPRESS was designed and implemented in Zetalisp on a 3600 series Symbolics LISP machine. Initially, EMPRESS was built as a concept demonstration system.'
'NASA used Symbolics' high-definition technology to analyze HDTV video images of the Discovery launch in real-time. This high-definition system enabled NASA engineers to get an instant replay of critical launch systems.'
Since he did AI work in the '50s, probably a lot of his work was written in LISP or IPL, which had many of the concepts that LISP later used. Most of his high-profile inventions (computer mice, hypertext, interactive computers) were first commercialized on LISP machines in the '80s. LISP machines also pioneered some other concepts such as windowing systems, graphics rendering, modern networking, garbage collection, etc.
The main site, https://multicians.org/, is full of nice histories, including how much safer Multics was than UNIX, thanks to PL/I strong typing.
I was going to suggest to anyone interested in VMS to head over to the deathrow cluster but unfortunately it's now dead. https://deathrow.vistech.net/410.html
Interesting. I remember Jon Bentley [1] mentions BLISS multiple times in his book Writing Efficient Programs.
The book is a gold mine of software performance tuning techniques (with recommendations on when to use them or not), organized in the form of many numbered and well-named "rules" (so that you can remember when to apply them) about saving time or space, or trading space for time or vice versa, all with plenty of examples and "war stories". The book also talks about working at many levels of the stack (although most of the focus is at the level of program code), all the way from hardware up to algorithms.
I took a brief look at the BLISS language, and I have to say that it does seem a bit verbose, though in general, in the larger scheme of things, that may not matter, given other qualities of a language, or may even be an advantage over more cryptic or concise languages.
Every time I see a new release of Mezzano mentioned on HN, I smile! I only play with Mezzano using VirtualBox, but one day I would like to get it running bare metal on one of my old laptops. There are a few showstoppers for using Mezzano as a daily driver for getting work done, like the lack of SSH, but in general it is fairly complete. Kudos!
I'm perfectly fine with that and can live with it.
But why oh why the terrible window decorations, with beveling, gradients, rounding, and shiny bubble buttons that don't fit the rest?
I'm quite fond of the Everything in Language X projects. In this case, how does Mezzano boot a CL runtime? I'd expect this to be a combination of assembly and C, but if they have CL all the way down somehow that'd be really friggin' cool.
As far as I could tell from looking around the project a bit, they configure the boot sector of a USB/CD/DVD/ISO to jump directly into a pre-built Lisp image, no assembly/C code required. At the very least, the paging system, IO, and GC definitely seem to be written in pure Lisp.
They do compile the initial Mezzano image using SBCL, which is partly written in C, but it seems that they don't depend on any SBCL/C code on the machine that they control.
I'm definitely not an expert though, I may be completely wrong about this :)
Many (most?) CL implementations compile to native code, so you "just" need the boot part to avoid using any part of the runtime. You will have to use lower-level functionality; I don't know if there is anything in the standard, but many implementations do offer it.
Emacs isn't a CL program, but Mezzano has an Emacs-like editor.
I don't think SSH has ever had an implementation in pure Lisp. Telnet is certainly there, but I don't think SSH is. Theoretically you could try compiling OpenSSH into LLVM IR and then into Lisp, the same way Quake was compiled, though...
That’s very nit-y. I’ve used several different emacs-like editors, including Hemlock, Edwin, Alpha, and Epsilon, and while I’m happy that they exist, the current state of the world is pretty much divided between GNU Emacs plus close variants, and editors that copy parts of the emacs UI but not its core extensibility.
In most cases, the key (I claim) isn’t “is it GNU or not?” but rather “can it run a large fraction of the huge mass of elisp?” - and unfortunately the answer is either “no” or “hopefully in the future”.
If that’s changed, please do let me know. Is there a better Common Lisp emacs these days than Hemlock? Is anyone making progress on a cl-capable guile emacs anymore? Thanks in advance.
I use Clozure CL and LispWorks, which have CL based Hemlock variants.
> “can it run a large fraction of the huge mass of elisp?”
Most of that runs only in GNU Emacs, and less so in forks of it (XEmacs).
Remember, Mezzano's goal is not to be a GNU or Linux compatible thing. There are already lots of that.
When one writes a Lisp system like Mezzano, it's probably better not to use GNU Emacs anyway: compatibility with GNU Emacs isn't important, the Emacs UI isn't that great, and there is no GNU system underneath.
If it runs McCLIM, one could also port a McCLIM-based editor (also a variant of Emacs) to it and use that.
Sure it will not run the zillion lines of Emacs Lisp, but I guess that was not the main interest for this 'exotic' Lisp OS project.
Honestly, based on my minimal experience fooling with Zmacs on my Lisp Machines, a non-GNU emacs with Common Lisp underneath is likely to be much better, especially if it can use a general purpose command system like CLIM’s.
Also if one looks at MCL's Fred (-> Fred resembles Emacs deliberately), that was nicely programmable and also possible to use as dialog items in GUIs. For CL applications I vastly prefer such a flexible model (object-oriented architecture, editor windows with an optimized Emacs-like command set, reuse as components in GUIs, ...), instead of a large monolithic editor, which then needs to be used as an external program...
There were the Lisp machines, Symbolics etc. I was lucky enough to work on one. Then they fell out of favor because cheaper hardware could run anything.
But I wonder, what happened to the Lisp-machine software? Should it not be possible to run it on current day general purpose hardware even more snappily?
The short answer to what happened to Genera is that it's owned by a private holding company that bought up a bunch of Symbolics IP in the 1996 bankruptcy proceedings. They made one further release in 1998, and afaik haven't done any development on it since.
Presumably there's some price at which they'd be willing to either open-source it or sell it to someone who would put dev resources into it, but that hasn't happened.
Even though it's useful, I would start by writing a bootloader :D It will first introduce you to 'bare metal' code before you try to create an actual OS on top of it.
The OSDev wiki is my favorite site ever :D so many interesting tidbits on all things low-level spread all over it.
As a professional Lisp programmer, this is quite readable and well-structured. Lacking in comments and documentation, but I was able to infer things well enough.
defun is actually a complete sentence, contracted, as others have pointed out. Defining a function uses a noun-like keyword in some languages. For instance in, oh, Awk, the function keyword is used.
There is a def prefix convention in Lisp, as well as a define- one: defmacro, defclass, defconstant, ... define-symbol-macro, define-setf-expansion. The def-s are contracted words with no dashes; the define-s are whole words joined by dashes. All the macros named this way have some global defining effect and are mainly used as top-level forms: if you see one nested in code, that's a red flag.
Functions are often named verbs when they are not actually functions, but procedures: subroutines that have a side effect: print, put_pixel, sort, connect, ...
Functions that calculate something from their arguments and return a value are often not named after verbs, but rather nouns.
Even if you aren't familiar with functional programming, you probably know some pure functions in some languages. For instance, arithmetic ones: sin, sqrt, atan. Or how about accessors that retrieve the property or state of an object: length, position, temperature. All these are nouns.
(Sometimes pure functions are verbed, after the process that they perform to calculate the return value, such as join or catenate. sort and reverse could be names of pure functions that return a sorted or reversed sequence, though reverse is also a noun: a function that returns the reverse of its input.)
Blind adherence to rules like "functions must start with verbs" is a symptom of cargo cult programming.
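The conventions described above fit in one small example (all the names below are invented for illustration):

```lisp
;; Noun-named pure function: computes a value, no side effects.
(defun temperature (sensor)
  (getf sensor :temperature))

;; Verb-named procedure: exists for its side effect (printing).
(defun print-temperature (sensor)
  (format t "~A degrees~%" (temperature sensor)))

;; def- prefix: a contracted word, global defining effect, top level.
(defconstant +boiling-point+ 100)

;; define- prefix: whole words joined by dashes.
(defvar *sensor* '(:temperature 25))
(define-symbol-macro boiling-p
    (>= (temperature *sensor*) +boiling-point+))
```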
It seems you should learn more about Lisp—and how to read it—before criticizing it as unreadable. As it is, your comment simply reads as if you were saying, "French is unreadable because I only know English." Of course it is. So learn how to read the new language.
FWIW, noun-verb ordering of functions is reasonably idiomatic in lisp, particularly when single-dispatch might be used in other languages.
The usage of eval there makes sense because it is launching a program. It's equivalent to e.g. using system() or /bin/sh in a launcher on a unix system.
The eval here seems to be providing the feature of exposing Lisp evaluation to the end user via some UI field. It's not "I used eval because I don't know about apply or macros".
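That pattern, reading a form from a UI field and handing it to the evaluator, amounts to something like the following (the function name is invented here, not taken from the Mezzano source):

```lisp
(defun run-user-form (input-string)
  "Evaluate one Lisp form typed by the user,
much as a shell evaluates a command line."
  (eval (read-from-string input-string)))

;; (run-user-form "(+ 1 2)") => 3
```

A production version would read in a restricted package and wrap the call in error handling, but the core idea is exactly this two-liner.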
How did this unreadable gobbledygook fail to conceal its use of eval from your eyes?
While it lacks the level of comments and docstrings that I typically use (a ton), it seems at first glance like reasonably structured imperative style CL.
I hope to god they used the LLVM -> CL transpiler (also written by this guy) to make this xD Can't imagine the horror of writing this directly. Awesome though. Nerd points 9000+.