Second Life on GitHub (secondlife.com)
294 points by ohjeez on Dec 24, 2022 | hide | past | favorite | 121 comments


It's been open source since 2007. Linden Lab recently moved the sources from Bitbucket to Github.

The Firestorm viewer sources are at "https://vcs.firestormviewer.org/". It's a major fork.

Technical documentation of the client-server protocol is at "https://wiki.secondlife.com/". It's out of date, but it helps.

The real progress is that the build procedure has been cleaned up. It used to require ancient versions of Visual Studio on Windows. Firestorm can be built on Linux. I've built it and submitted patches that went in.

This is just the client side. The server is proprietary. But there's a compatible server, Open Simulator. It's not a clone; it's in C#, while the original is in C++.

There are meetings for third party viewer developers. Both inside Second Life, and on OSGrid, which is a third party grid of simulators. That really is the metaverse - a federated system of 3D worlds. It's sluggish, because most of the people running regions don't have enough server power. Think Mastodon for 3D.

I've been writing a new client in safe Rust for the last two years. I'm trying to see how much can be done in parallel. The viewer isn't ready for public use yet. I've posted videos made with it.


> That really is the metaverse - a federated system of 3D worlds. It's sluggish, because most of the people running regions don't have enough server power. Think Mastodon for 3D.

You might want to take a look at the Matrix.org approach to this: https://thirdroom.io/

Fun fact: Apparently Matrix (the protocol itself) was made after a brainstorm session on how to create an improved and decentralized / federated Second Life game :)


What's going on with the scrollbar on that site? It disappears around a second after stopping scrolling. It even disappears if you're holding it and don't scroll for a second.


I don’t have that on this particular site, but in my experience that particular defect is caused by a browser plug-in that blocks cookie banners or newsletter pop-ups. Try disabling them and reloading.


I had no idea Second Life has been an open source game this whole time! It might be one of the largest open-source video games of all time, then.


The server components, which contain most of the logic, are not open source. There has been a long-running effort to create a compatible open source server.


Opensimulator is the open source project (http://opensimulator.org/)

There are thousands and thousands of active regions and thousands of users (https://opensimworld.com/). It's not millions, but it's the best self-hosted virtual world ecosystem.


That's incredibly surprising that it doesn't have one. I was involved in the l2j and wow emulation communities back in the day, and there was just so much effort going into it.



Cool! Sorry, it's been 10 years since I last paid attention to them, so I have no idea where those communities have migrated to.


> OSGrid, which is a third party grid of simulators. That really is the metaverse - a federated system of 3D worlds.

I had no idea this existed, and have been looking for it [conceptually] for years. Thanks!


This is something a lot of people have toyed with over the years, at least since Snow Crash, where federated 3D worlds are an important part of the plot.

At the time, most people thought the open ecosystem of interconnected endpoints was destined to dominate global information storage: SMTP had outcompeted proprietary messaging systems, the URL was displacing localized hypertext systems, and the open web looked unstoppable.

That was before the reactionary trend where gmail overtook the smtp momentum and actors such as Facebook expanded their walled gardens by absorbing or straight up acquiring forums and smaller players. Facebook wasn't unique in their business model of locking up as much of the world's information as possible into their platform, but they were certainly the most successful.

That made it all the more telling when Facebook made their pivot into assisted reality public, starting with naming their walled garden the metaverse, a term taken right out of the most visionary futurists of yesteryear! Let's see if the time has come for the pendulum to swing back. It'd certainly be most fitting if the term itself gave new life to the open interconnected ecosystems to show us again what's possible with permissionless innovation.


What is a "firestorm viewer"?

Is a "viewer" a Second Life term for "game client"? And is Firestorm a particular FOSS client for the game?

If so, that's really interesting... like you can access the same shared Second Life world but through a custom client? Does it reuse existing game assets, or are you free to reinterpret character models, textures, effects, etc.?


A viewer is like a web browser for a 3D world. It's like a game client, but it doesn't have any game logic or content. As with a web browser, all content is downloaded from the servers. Firestorm is a third-party viewer with a sizable development organization. See "firestormviewer.org".

Quick overview:

Second Life / Open Simulator worlds are divided into "regions". Each region is 256m on a side on Second Life, and can be other sizes in Open Simulator. The world is about the size of Los Angeles. The viewer starts out by talking to a login server for a grid, which authorizes it to connect to a region. The viewer then talks to the region. There are short messages over UDP, and some bigger stuff over HTTPS.

Once a viewer is talking to a region, it asks the region server about adjacent regions. The viewer can then talk to a few adjacent regions, too, showing the user a big, seamless world. As the user moves around the world, connections are made to new regions, and connections to regions out of range are dropped. This is how the world scales. There are about 27,500 live regions today, each with its own simulator program, always on even if nobody is using it. Each simulator program takes about one CPU and under 4GB on a server. Second Life is currently hosted on AWS, with dedicated servers. This scales well; the world could be scaled up substantially if there was demand.
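The adjacent-region logic described above can be sketched in a few lines. This is an illustrative model only, assuming the 256m grid from the thread; the function names are hypothetical, not from any actual viewer code:

```python
# Sketch of how a viewer might decide which regions to hold
# connections to, based on a 256 m region grid. Illustrative
# names only; not the real viewer's code.

REGION_SIZE = 256  # metres per region side in Second Life

def region_of(x: float, y: float) -> tuple[int, int]:
    """Map a global position to integer region-grid coordinates."""
    return (int(x // REGION_SIZE), int(y // REGION_SIZE))

def regions_in_range(x: float, y: float) -> set[tuple[int, int]]:
    """The current region plus its eight neighbours."""
    rx, ry = region_of(x, y)
    return {(rx + dx, ry + dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1)}

# As the avatar moves, connect to newly in-range regions and
# drop the ones that fell out of range.
old = regions_in_range(100.0, 100.0)
new = regions_in_range(300.0, 100.0)   # crossed into the next region east
to_connect = new - old
to_drop = old - new
```

Moving one region east swaps out one column of three regions for another, which is what makes the world appear seamless while keeping each viewer's connection count small.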

The always on feature runs up costs, but the business model is that users rent land. So costs and revenue stay in balance. Regions really are always on - trains run through regions where no one is around, for example. This architecture made more sense when Second Life had their own servers in a co-location facility.

The region servers talk to their direct neighbors. This allows movement across region boundaries. If a region goes down or is restarted, the viewer will show a water-filled hole with steep sides where the region is supposed to be. So the overall system is fault-tolerant.

Content is stored on asset servers, separate from the region servers. These are ordinary AWS web servers, front-ended by Akamai caches. Content includes textures (images in JPEG 2000 format, which is a pain to decompress fast), meshes (a documented but nonstandard format), sounds, etc. There's petabytes of content. The region servers tell the viewers what hashes (UUIDs) to ask for, the viewers get the content from the servers with ordinary HTTPS requests, and all this is assembled in the viewer into 3D images for the user.
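The asset flow above — region servers hand out UUIDs, viewers fetch each asset over HTTPS at most once — amounts to a UUID-keyed cache in front of a transport. A minimal sketch, with the fetch function and URL shape being assumptions rather than the real Second Life endpoints:

```python
# Illustrative sketch: cache assets by UUID so each one is
# fetched over the network at most once. The fetcher is injected,
# so the transport details stay out of the cache logic.

from typing import Callable, Dict

class AssetCache:
    def __init__(self, fetch: Callable[[str], bytes]):
        self._fetch = fetch          # e.g. an HTTPS GET keyed by UUID
        self._cache: Dict[str, bytes] = {}

    def get(self, uuid: str) -> bytes:
        if uuid not in self._cache:
            self._cache[uuid] = self._fetch(uuid)
        return self._cache[uuid]

# With a real transport, fetch might look something like
#   lambda uuid: https_get(f"https://assets.example/{uuid}")
# where https_get and the URL are hypothetical placeholders.
```

Because assets are content-addressed by UUID and immutable, a cache like this never needs invalidation, which is also what makes front-ending the asset servers with CDN caches straightforward.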

Any user can upload new content. Uploading an image costs about US$0.05. Uploading a 3D model might cost $0.50. This is a one-time charge. As long as someone, somewhere, is using that content, it's kept on the servers. There's a garbage collection run every month or so.

Voice is outsourced to Vivox. (Vivox has problems. Not recommended for new designs.)

Coming soon is "puppetry". Second Life is now testing full body and face tracking, for those who want their avatars to run more than canned animations. This involves an entirely new data path by which viewers talk to other nearby viewers via a server that just forwards data. Second Life is getting VRchat features.

So that's what a viewer is. And that's what the architecture of a big metaverse looks like.


Not OP, but would like to say thank you for the exhaustive reply anyway. This sounds like an extraordinarily well-designed system, even with the obvious optimisation opportunities… sad it was never destined to really take off!


Thanks for this; it's inspiring what open source can do even in such a niche area. This is much closer to the real metaverse idea. Zuck's $10 billion/year vaporware will rot in a few years while such communities chug along.


I thought second life doesn't do VR and they made a spinoff that did? Has that changed? Otherwise I don't understand the need for face tracking :)

If they support VR now I might give it another look.


That was Sansar, which was a "game level loader" type of virtual world. You connect to a space, you wait while a gigabyte or so of content downloads, and then performance is good. It didn't sell. Usage was 10 to 100 concurrent users. It was sold off to another company. It's still up, averaging about 5 concurrent users.[1]

The idea behind "puppetry" is that not everyone will use it, but performers and people who want to be looked at will. The audience doesn't have to wear trackers. I'm looking forward to live theater in Second Life.

I've been to puppetry virtual meetings in Second Life. Someone rigged for full body tracking and with a good microphone dominates the meeting without effort.

[1] https://steamcharts.com/app/586110


Thanks for the detailed explanation! Second Life sounds really cool now. I didn't realize there was all this dynamic downloading of assets and a marketplace for it.


Thanks for such a detailed introduction! This alone could be a blog post, tbh.


https://www.firestormviewer.org/

Assets are downloaded from the servers. Users can upload and optionally gift / sell their own.


Everything in second life is sent to the client (which usually tries to cache). Eg you can even build new geometries ingame.

Viewers are "game clients" yes. The viewer is intended to mostly be a dumb receiver. The server does most lifting.

Firestorm is a fork of the official viewer. It forked about the time I played SL, but seems to have grown into its own these days.


It's a client

Assets are downloaded by the viewer from the SL servers.



> In 2022, the renewed general interest in the Metaverse translated into significant media opportunities for Second Life. Over 21,000 articles and other stories included Second Life in 2022 – nearly 5 times greater than the amount of media coverage we received in 2021.

Out of all the metaverse projects being worked on, SL coming out on top would be an incredible development


While I love me some SL, it's a sad, sad statement about the current state of the tech world that SL is the best we can do... did we really peak in 2002?


Yes. We had virtual worlds and an internet that wasn't yet completely broken. The Patriot Act was yet to be fully implemented, and we were recovering from the dotcom meltdown.

As good a peak as any.


This idea peaked in 2002. Second Life fills a valuable niche, and I’m glad it exists, but most people prefer videogames.


If this was done in 2022, it'd be written in javascript and funded from attention-seeking ad tech revenue, so perhaps we did?


We're in the trough of disillusionment for virtual worlds. The tech just doesn't deliver on the promise yet.


That is not the real problem. The real problem with virtual worlds is fun. Or lack thereof.

Virtual worlds are not entertainment media. There are entertainments within them, but the worlds themselves are neutral. That's true of Decentraland, Second Life, Somnium Space, and even VRChat. You have to make your own fun. Or find some place where someone else has made something fun.

The world itself is completely indifferent to you. It's not like playing a game, where the game forces you to do something immediately. It's like arriving in a new city on a bus and being dropped off at the big bus station. (Second Life's Social Island 10 is the Port Authority Bus Terminal of Second Life.)

This completely throws many new users. They want to be entertained, or told what to do. If your idea of a good time is being given access to a good workshop, Second Life can be very satisfying. Minecraft players do fine in SL. FPS gamers, not so much. If you're not very creative, it can be extremely frustrating. A majority of the population is not that creative.

Second Life has a social scene, and until you figure the place out, you're on the outside looking in. This drives many new users nuts. Some try being annoying. Then they discover the hard way how Second Life handles that. They get banned from a club by the club owner. They find that some clubs share ban lists, and now they're locked out of multiple venues. Anyone can ban anyone from their land for any reason, and Linden Lab will not interfere. Linden Lab itself does do some rule enforcement, but you have to file a complaint and wait a few days until the tiny Governance unit gets around to it. This is for situations where you'd call the cops in real life. There's no outsourced army of minimum wage moderators armed with ban hammers. It's all social pressure.

You can get a vehicle and just drive around. You'll pass houses, stores, gas stations, art galleries, coffee shops, farms, malls, vacant lots... Just like real life. It turns out that if you let people build, mostly they build familiar stuff. There are castles, fantasy areas and space stations, but mostly, people replicate suburban America.

Much of Second Life is rather banal. Linden Lab decided to build an island of unfurnished suburban houses with nice landscaping and rent them out, for people who wanted a pre-built lifestyle. After five expansions, there are now over 60,000 of those, with each user paying about US$100/year. People have parties, BBQs, and other suburban stuff. Really.

The technology is not the problem. Figuring out what to do once you have the technology is the problem.


Thanks John. SL devs should check out Roblox, plenty of fun experiences in there.


Thank you for your post, I find it very fascinating.

I used to play MUDs back in the day and they had a protocol that, in theory, could allow you to move from MUD to MUD and transfer your character. I don't think it ever got really popular though, due to all the problems you can imagine from trying to seamlessly move between completely different MUDs. This is all from memory, so I could be mistaken about specifics.

When second life first came out I think I logged onto it a few times and then concluded it was a gimmick and haven't really been back since. I think your post explains very nicely why it never drew me in, I thought I was logging into a game.


I mean, Second Life has come a LONG way since its 2002 days.


TFA suggests that the official git repo for the Linux kernel is on GitHub. I wonder if they really did their research


Was coming here to say almost the same thing - pretty sure the kernel's "de facto home" is not in fact GitHub.

Perhaps it was just a marketing person writing this who googled "Big open source github".


Lol, thought the same when I read "CPython".. Okay? Why not mention stuff like Godot and MonoGame? Those are things the OSS gaming community is more familiar with. And why not say "the CPython interpreter"? As if everyone knew what CPython was.


The source is published to GitHub though, no? https://github.com/torvalds/linux

It's an easy mistake to make, the code is there and there's nothing to suggest it's a mirror.


That said, every PR receives a bot comment indicating that the GH repo is a mirror and the user needs to submit a patch to LKML. And yet there are currently 312 open PRs on the repo. I've always wondered how people manage to know enough about the kernel to be able to contribute a PR to it, yet don't know the GH repo is a mirror and seemingly ignore the bot response and leave their PR opened.


I'm surprised with the massive number of auto-close-bots on GH (combined with the kernel's development culture) that they don't just auto-close them

I've also never understood why GH allows turning off issues, but not turning off PRs


I believe this is often people who ARE in fact contributing via the mailing list -- but want a track record tied to their GH account.


People are aware the PR isn't going to be accepted; they just want to publish it somewhere.


What is the reason for hosting it elsewhere? My cursory Google searches have failed me.


The historical reason is https://github.com/torvalds/linux/pull/17#issuecomment-56546...

Perhaps some of these issues have been fixed by now, but the Linux project has been going on just fine without GitHub (git was invented for the Linux kernel project itself, before GitHub even existed), so there doesn't seem to be any reason to switch even if all the mentioned problems were to be fixed.


That thread is hilarious to look back on, now that history has proven Linus was incorrect about both:

- word-wrapping (the correct approach is to indicate preformatted text in markup and allow the viewer to control wrapping for all other text)

- his own demeanor (Linus stepped down in 2018 from his role as BDFL of Linux to "get some assistance on how to understand people’s emotions and respond appropriately").


Some good old Torvalds roasting going on in that thread.


Ironically his rant about why authors should do the line-breaking

instead of your text renderer is really frustrating to

read because of all the seemingly random line breaks

throughout his comment.


Yikes. I didn't know the dude was so prickly. Glad I'm not a contributor.


With his attitude, the kernel has reached enormous market share, excellent performance, high quality and worldwide adoption. I’m sure it will be fine without your contribution.


Lol yes I'm sure it'll be fine too. It made me lose some respect for him, but he couldn't care less what I think lol.


No one knows "what would have been different", but it is very possible that the success happened not because of, but despite, Linus's rants. And if he had had some self-reflection earlier, I might now be able to recommend Linux to non-tech people and enjoy a laptop without suspend and hibernation issues.

Just a theory.


No, Linus has a way of seeing straight through bullshit to the heart of the matter. You may not like how he expresses himself, but part of that is making it very clear where the bullshit is.

For example, his take on security bug prioritization. His argument is that all bugs have potential security ramifications in the kernel, therefore calling something a security bug is not useful, it's just a bug.

"Security people" want special treatment of their reports and he refused to give it to them.


I agree he is a genius in this regard, seeing the core of an issue clearly. I also think the world tolerates more from brilliant people, and this is why some workplaces have prima donnas. But look at some of the things Torvalds says; it is just... intolerable. He once asked someone if he was dropped on the head as a baby because he had a suggestion that went against Torvalds'.


I don't doubt that Linus is a genius. What I wonder is whether he's driving away other potentially valuable contributors (not myself, but other geniuses) with his tone.

It's not like Linux never has security vulnerabilities, and if he's actively driving people away with his hostility... what happens when someone actually catches a mistake of his but doesn't want to deal with reporting it to him because he's so thorny? What happens when he decides to retire or dies? Is Linux going to fragment into petty warlordism because the only core contributors who managed to thrive in that atmosphere were equally tough-skinned and vicious? It reminds me of dictators' empires that collapse after the strongman goes away, unless another strongman can immediately take over and consolidate power.

I'm not saying that he needs to be all nice and accommodating and accept crap PRs all over the place, but there are ways of gently correcting people and encouraging them to fix mistakes while not driving them away altogether. Some of those might've turned out to be valuable contributors with some coaching, rather than being turned off altogether.

Now I wonder how much of the abysmal adoption of Linux on the desktop is because of hostile attitudes like that. Could Linux have altogether obliterated Windows and macOS if it had a more welcoming, inclusive community that took different perspectives and user needs into account? Maybe they never wanted that, preferring a clear kernel/userspace separation? I dunno. Of course Linux is super valuable on servers and in embedded devices, but how much of its greater potential was limited by the arrogance and hostility of one man?

shrug On the other hand, look at more "community-driven" FOSS efforts that inevitably fragment and diverge from their original core mission... the modern Web is a mess (looking at you, ECMAscript), Mozilla has its tentacles everywhere while Firefox goes down the gutter, Wikipedia was taken over by a cabal of elite admins that care more about power than users, the BSDs are mostly niche now... maybe there IS an argument to be made for a strongman, dictatorial vision?

Just food for thought.


We already know what would happen, there are other projects that are more in-line with what you want and _NONE_ of them are nearly as successful.

So, while your thought experiment is probably fun, we already have the answer to it.


I do find them fun to read, but yeah being the target of one must be rough. Though it probably contributes to the quality of the Linux kernel.


He's mellowed a bit as the years have gone on but yeah, he sure is a character.


He's amazing


Linux kernel development has been coordinated via mailing lists for ages.

Linus wrote the git version control system specifically to suit the development of Linux.

GitHub made git mainstream, and they made lots of great features for the masses.

But there is no reason for the Linux project to change the way that they like to work just because of that.

Still, a copy of the Linux source tree is also hosted on GitHub. So it seems to me that everyone gets what they want. Linux get to continue their development in the way that they like to do it. People on GitHub can easily find the copy of Linux on GH, and browse the sources there. And they can create forks of the Linux sources on GH too.


But AFAIK the Second Life viewer is not exactly open source; only selected teams can participate.

EDIT: the viewer code is LGPL, but forked third-party viewers are subject to particular terms:

https://secondlife.com/corporate/third-party-viewers


There are terms which apply when connecting to Second Life servers.

Those terms don't apply if you use the viewer to connect to non Second Life servers, such as Open Simulator. For Open Simulator, you have to deal with a somewhat grouchy lead developer who does not suffer fools gladly. (He's helpful if you don't waste his time. He ran a test region server for me and looked at what I was sending to get something to interoperate.)

Some viewers turn on extra features when talking to Open Simulator. The main one is "varregions". Second Life is divided into squares 256m on a side. Open Simulator supports other sizes. Also, there are several competing money and asset systems for Open Simulator grids, and third party viewers support those.

This is the metaverse at the level where it actually works.


AFAIK the lack of viewer developers is the main reason why OpenSimulator has stalled. Even the server is supported by very few developers who still contribute, including Ubit. The project has suffered from high drama for decades.


Last I remember, they didn't want people working on both viewer and OpenSim to avoid accusations of copying from a GPL viewer into the permissively licensed OpenSim. This, obviously, makes implementing new non-Linden features nigh impossible.

Of course, this policy changed a few months after I stopped playing with OpenSim. I think realXtend had something to do with it. I remember playing around with their viewer and being impressed, but never doing much else with it.


The internal OS politics probably don't help, so yeah: high drama is probably the killer right there.


Is Open Simulator back in active development? The last time I looked at it there hadn't been a release for some years, if I'm remembering right.


To answer my own question, yes it does appear to be in semi-active development. At least there's a release from this year.


There's like two or three forks out there. I'm not sure how active any of them are though.


So it's open-source but not FLOSS


No, open source means a specific thing. You're perhaps thinking of "source available", which means "you can read the source but have to comply with a bunch of stuff".


Not really. A lot of folks, including myself, don't subscribe to the OSI's definition of open-source because it just doesn't match the common usage. Open-source takes many forms, and source available is just one example.


If you can't modify it and distribute your modifications, it's not open source. Anyone who says different is simply incorrect.


Well you can just redefine the meaning of arbitrary words, but the result is that nobody understands your attempts at communication.

I personally prefer to be effective at communication, so I don't just use words to mean something different than what everyone else thinks.


> I personally prefer to be effective at communication, so I don't just use words to mean something different than what everyone else thinks.

And “what everyone else thinks” is mostly different from the OSI definition.


> And “what everyone else thinks” is mostly different from the OSI definition.

Sorry, I meant: "everyone else except 3 or 4 people on ycombinator"


Ask a random person on GitHub how they would define “open source”. I highly doubt they'll mention the OSI.


GitHub itself uses the OSI definition in its ReadME guide to open source:

> Many people think that Open Source simply means availability of the source code of a project, but that does only tell part of the whole story.

> The Open Source Initiative (OSI) provides a commonly accepted definition of what constitutes Open Source. To summarize that, in order to be constituted Open Source,

>> a work has to allow free redistribution,

>> the source code needs to be made available,

>> it must be possible to create further works based on it,

>> there must be no limitations of who may use the work or for what purpose (so something like "no commercial use" or "no military use" won't fly with Open Source),

>> the work must not require an additional license on top of the one it comes with,

>> and finally, the license must not depend on a specific distribution format, technology or presence of other works.

> So, you see, it goes way beyond "the source code is available", in fact, a whole lot more requirements are stated that must be fulfilled in order for a work to really be considered Open Source.

https://github.com/readme/guides/open-source-licensing

Also, GitHub's most starred repo is freeCodeCamp (359k stars - https://github.com/freeCodeCamp/freeCodeCamp). Linked right in the repo's README, freeCodeCamp defines open source as:

> Open Source Software is code that is publicly available for people to view, modify, and share.

https://www.freecodecamp.org/news/what-is-open-source-softwa...


Considering the term originated with them, it's fair to say they get to dictate what it means.


Ah, things would be simpler indeed if words could be dictated.


And they can be; that's why we have dictionaries.

Terms DO have meanings whether contrarians and the argumentative want to acknowledge those meanings or not.


Dictionaries are descriptive, not prescriptive. The meaning of terms outside of specific fields like law is based on consensus, not authority. Merriam-Webster doesn't own the English language, and neither does OSI.


If you want to look for consensus, there's no better place than Wikipedia.

> Open-source software (OSS) is computer software that is released under a license in which the copyright holder grants users the rights to use, study, change, and distribute the software and its source code to anyone and for any purpose.

https://en.wikipedia.org/wiki/Open-source_software

Citations for this sentence (many more in the full article):

- https://books.google.com/books?id=04jG7TTLujoC&pg=PA4#v=onep...

- https://ejournals.bc.edu/index.php/ital/article/view/5105


No, open source is a vague term with many different definitions. The open source foundation does not control the english language.


The open source software movement at large does not appreciate companies trying to water down the meaning of "open source" to mean the same thing as "source available" for software, because the term "open source" guarantees users additional rights.

The Open Source Initiative is one of many groups that define open source software as including the right to not only view the source code but also to modify and redistribute it. Others include

Merriam-Webster Dictionary:

> of software : having the source code freely available for possible modification and redistribution

https://www.merriam-webster.com/dictionary/open-source

New Oxford American Dictionary:

> denoting software for which the original source code is made freely available and may be redistributed and modified

https://subscription.packtpub.com/book/web-development/97817...

And the authors of all of the academic literature cited in https://en.wikipedia.org/wiki/Open-source_software


Great, I'm not a company, just a regular person. Language is defined based on how regular people use it.


Yes, and regular people who use open source software understand that they have the freedom to modify and redistribute open source software. On the other hand, there is a minority that tries to sell proprietary software by misusing a term that originated in the open source community and incorrectly applying it to software that is not open source.


Can you define what exactly open source means in that case?


The majority opinion is the OSI definition and approved licenses but not everyone agrees.


source is open as in "open to read, download, and run" (and even modify, as long as you're doing it privately)?


If you can't change it and distribute your changes it's no longer open; pretty much by definition.

Maybe you're thinking of "looky loo" software or something? But if you can't change it and if you're not allowed to share your changes then it is NOT open.


Second Life has always really interested me; it seems like it's got quite the meaningful heritage behind it. If they are looking to move into the full VR/AR/metaverse space, I can see that wisdom/prior art coming in very handy.



I wonder why Meta didn't fork Second Life to use for their 3D world rather than re-invent the wheel and in the end build a bad product.


I don't know about the clients, but I've worked with a few ex-LL people and heard nobody really likes what the protocol became, and also lots of core parts of the LSL execution model they regret. Presumably Cory thought he could do better the second time around. (And maybe he did! I'm not sure Facebook's metaverse problems are technical ones...)


One tiny design error had huge implications. The UDP messages are multiple messages per packet with the form

    [msg_type variable_length_message]
There is no message length in the message. So, to parse the message stream, the receiver must know how to parse each message type in detail, and can't skip message types it doesn't understand. Thus, no new message types can be sent until everything that receives them has a parser for the new type. And, so, there have been no new message types since about 2016.

Most game systems use

    [msg_type msg_length variable_length_message]
so you can introduce new messages and not break old receivers.
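A minimal sketch of that second scheme in Python (the 1-byte type and 2-byte big-endian length fields are invented for illustration; the real wire format differs) shows why the length prefix matters: the receiver can skip message types it doesn't understand instead of losing sync.

```python
import struct

def parse_messages(stream, known_parsers):
    # stream: bytes laid out as repeated [msg_type][msg_length][payload].
    # known_parsers: dict mapping msg_type -> handler function.
    # Unknown types are skipped using the length field, so new
    # message types don't break old receivers.
    pos = 0
    results = []
    while pos + 3 <= len(stream):
        msg_type, msg_len = struct.unpack_from("!BH", stream, pos)
        payload = stream[pos + 3 : pos + 3 + msg_len]
        handler = known_parsers.get(msg_type)
        if handler is not None:
            results.append(handler(payload))
        pos += 3 + msg_len  # advance past this message regardless
    return results
```

With the Second Life-style format, the `pos += 3 + msg_len` step is impossible for an unknown type, because `msg_len` isn't on the wire; the parser is stuck.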


Seems like a good case for protobufs. Maybe there weren't any good IDLs back when they originated the protocol, though.

It seems like there's an easy way to retrofit the protocol without breaking it for older clients (as long as older clients truncate packets once a message type is unknown). Put a new message type at the end of the packet, with a list of message start offsets in it. By including its own offset as the last element, the start of the list is easily found. The first message offset could be omitted, since it is implicitly zero.


I am thoroughly enjoying all your contributions in this thread. I love getting a look into the architectural bits of a project/community of this scale that I don’t really know much about. Very interesting, thanks


As I mentioned previously, there's not much written about this. Which is probably why existing metaverse projects are so awful. It's mostly taking Unity or Unreal Engine, which are intended for use with carefully pre-built content, and somehow trying to make a large dynamic virtual world from those parts. The duct tape is troublesome.

The primary technical designer of Second Life did a very good job. Then he had a disagreement with management and was fired. So he went to Facebook, developed their mobile client, became a Facebook VP, made lots of money, and was semi-retired for a while.

Scaling to a big world is really hard. Big has two dimensions - area and density. The Second Life architecture scales well in area but not in population density. More than 20-30 users in a region will choke it. More are possible if most users sit down and don't move, because seated avatars don't get physics simulation. So audience-type events work.

Improbable was working on large crowds, but their solution is really expensive to run. Which is why they just do demos of Otherside for a few hours at a time every few months. Also, they use very simple avatars and do not, as yet, support vehicles. Took them over US$500 million to get to that point.

They have dynamic regions - more people, divide the world into smaller chunks. That introduces a whole range of new problems. What if I have 20 people on my boat, and a region boundary moves under the boat while I'm crossing it? Stuff like that.

Fixed region crossings involving multiple avatars and vehicles are troublesome in Second Life. People do try bus tours, but sometimes an avatar gets Left Behind at a region crossing. Linden Lab's devs have been trying to fix that for at least 8 years, without success. It's a tough problem in real-time distributed system design.


What's stopping them from introducing a new major version of the protocol that addresses this?

Presumably they could have a negotiation process that newer clients invoke to upgrade the message stream to a new format, and support both in parallel on the backend until an insignificant number of sessions were using the old version.
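A trivial sketch of that negotiation step (Python; the version numbers and the fallback convention are assumptions, not the real protocol):

```python
def negotiate(client_versions, server_versions):
    # Pick the highest protocol version both sides support; fall back
    # to version 1 (the legacy message stream) if there is no overlap.
    common = set(client_versions) & set(server_versions)
    return max(common) if common else 1
```

The hard part isn't the handshake itself; it's keeping two wire formats working correctly on the backend for however long the old clients linger.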


> Presumably Cory thought he could do better the second time around.

I remember a post from him where he said something like "Note to self: next time spend more than a week(end) on the scripting language". :-)


Primarily the problems with FB's metaverse are design (no legs? really?) and goals. I'm sure that on a purely technical level it's probably first rate. But as usually is the case, the biggest problem is people.


With Firestorm, Open Simulator and Termux you could theoretically host your own world in VR on your Quest. https://nwn.blogs.com/nwn/2021/03/second-life-social-vr-ocul...


This brought me nostalgia.

I tried SL in 2011. I was able to get some avatar add-ons for free. My favorite was a Tux (Linux mascot) doll. Due to the poor 3D graphics, the doll looked semi-detached from my avatar and moved with it without any animation. It looked ridiculous, but I enjoyed the game.

Curious if anyone else had that same add-on.


For any users, how active is it these days?

I dabbled with it many years ago and ... Christ, can it really be 20 years since release?!?!


Concurrency ranges from about 23k to 45-49k at any given time. God only knows how many of those are bots/scripted agents, though.


my second life is on Github as well :P


well said :)


That’s the project where Facebook‘s Metaverse forked from, right?


Well, one of the original founders (?) of SL went from there to FB, but I'm not sure what his original role was. I doubt that there was a fork as we think of it. Everything I've heard (second- or third-hand) is that the old SL server code is really messy and wouldn't migrate over well - probably better to simply take the lessons (social and technical) learned from SL and start over from scratch.


The founder of second life was inspired by the virtual world described in Snow Crash, so both are forks in spirit from that novel.


Hmm... I read (and loved) Snow Crash, that's not how I imagined it ;)


No, but when I saw where Meta's effort had got them it struck me they might have been better putting that development work towards modernizing an implementation of SL.


Perhaps when Linden Lab open-sourced the Second Life stuff, ChatGPT cached it and Meta asked it to write them a metaverse.

EDIT: Well it does something...

```Sure! Here is a basic client for Second Life written in Python:

import requests

# Set the base URL for the Second Life API
base_url = "https://api.secondlife.com"

# Set the endpoint for the login API
login_endpoint = "/login"```...


Everybody knows that to write a client for SL in Python you just have to write: import sl-client

Silly ChatGPT.


I wish. The Metaverse (Horizon Worlds) client is horribly primitive compared to the SL client.


The amount of GitHub contributions may have something to do with how popular Second Life is for furries, and how many IT professionals are furries.


Meta - "write that down write that down!!"


I don't know if things have changed but the last time I tried SL the client was so rough and awkward it seemed like an alpha. Even moving was laggy and annoying.


The biggest contributor of lag is not having things cached or being in a laggy region.

On first connect, it used to be common practice to go to a busy scene (a mall) and then go do something in real life for an hour or so while the viewer grabbed everything and cached it.

Regions can be very low lag or amazingly laggy, depending on a bunch of things (particularly misbehaving scripts).


The causes of lag in Second Life are complicated, and are, at last, getting a lot of attention right now.

I've been working on that with a multi-threaded viewer in Rust.[1] After I started posting videos like that, there was a lot less "can't do" attitude from the Linden Lab devs. Now there's a "performance viewer" project out of Linden Lab, using many of the same techniques. Key concept: the backlog of assets to be loaded needs to be on a priority queue which gets reordered as the viewpoint moves. Otherwise you bottleneck loading stuff that you were previously near, not stuff near where you are now. There's a nice B-tree solution to this. Once you have that, everything in close-up is present and at high resolution.
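The priority-queue idea can be sketched like this (Python, with `heapq` standing in for the B-tree-backed queue described above; distance to the viewpoint is the priority, and the queue is rebuilt when the camera moves):

```python
import heapq
import math

def reprioritize(pending, viewpoint):
    # pending: iterable of (asset_id, (x, y, z)) not yet loaded.
    # Rebuild the queue so assets nearest the current viewpoint
    # come out first; call this whenever the viewpoint moves enough.
    heap = [(math.dist(viewpoint, pos), asset_id)
            for asset_id, pos in pending]
    heapq.heapify(heap)
    return heap

def next_asset(heap):
    # Pop the nearest pending asset, or None when the queue is empty.
    return heapq.heappop(heap)[1] if heap else None
```

Rebuilding the whole heap on every move is the naive version; the B-tree approach mentioned above lets the queue be reordered incrementally instead.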

Hardware helps. 100 Mb/s networking and putting the viewer cache on an SSD will help a lot.

There's still a big problem server side with the transient load when a new user enters a region. A mall or event with a lot of traffic can slow way down. The server devs are trying for more concurrency, but it's hard in an old single-thread 32-bit C++ program.

It's striking that, despite all the "metaverse" hype, there are very few people, even among game devs, talking about the nuts and bolts of making this stuff work. The metaverse conferences are mostly about branding, NFTs, and moderation/censorship.

[1] https://video.hardlimit.com/w/sFPkECUxRUSxbKXRkCmjJK


Zuckerberg must be kicking himself for spending so many billions on the metaverse right now.


That money is being spent on VR/AR HW and software ecosystem, not one app on it.



