I stayed at one in Iceland where the entire bathroom wall facing the living room was unfrosted glass, including the door. The living room was open concept so it was hard to avoid seeing into the bathroom. Great place for couples to bring their entire selves to the relationship.
Thank you for your feedback. The platform has an advanced search feature at https://hostingmoz.com/search. From there, you can use the filters to dynamically view overall statistics as well as filter the search results. In the coming days, more filters and more robust analyses will be added.
Because in most languages they're not useful. Symbols are solutions to problems, some of which are:
1. mutable strings (ruby)
2. and / or expensive strings (erlang, also non-global)
If you have immutable "dense" strings and interning, and you automatically intern program symbols (identifiers, string literals, etc...) then symbols give you very little.
And then there are the slightly brain-damaged cases like javascript, where symbols are basically a way to get some level of namespacing, working around the dark years of ubiquitous ad-hoc expansions that left you completely stuck, unable to add new program symbols to existing types because you could break any page out there doing something stupid.
As the article covers, they are nice syntactically, regardless of those performance considerations. They fill a niche that in my experience actually turns out to be more common than string literals (though less common than strings as actual textual data).
I haven't written ruby (or any lisps) for a while, and I miss symbols.
They exist in K/Q. A single-word identifier-shaped symbol begins with a backtick, or a multi-word symbol can be created with a backtick and double quotes. A sequence of symbols is a vector literal, and is stored compactly. For example:
`apple
`"cherry pie"
`one`two`three
Many languages will intern string literals implicitly, or allow a programmer to explicitly intern a string; for example Java's "String.intern()".
The problem with string interning, especially for strings constructed at runtime, is that for the interning pool to be efficient it is very desirable for it to be append-only, and non-relocatable. A long-running program which generates new interned strings on the fly risks exhausting this pool or system memory.
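In Ruby, the closest analogue is String#-@, which returns a deduplicated frozen string from the interning table; a minimal irb sketch (Ruby 2.5+ deduplication behavior assumed):

    a = -"hello"              # frozen, deduplicated "hello" from the interning table
    b = -("hel" + "lo")       # builds a fresh "hello" at runtime, then dedups it
    a.equal?(b)               # => true, same object, so equality is a pointer check
    ("hel" + "lo").frozen?    # => false, ordinary runtime strings are not interned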
Personally, I found Ruby's symbols to be a source of bugs because they can easily get mixed up with strings. The article gives the example of dict[:employee_id]. But what happens if you serialize "dict" as JSON, then parse it again? The symbol :employee_id will be silently converted to "employee_id", which is treated as a different dict key from :employee_id. I found it was easy to lose track of whether a given dictionary is using the "keys are symbols" or the "keys are strings" convention, especially in larger codebases.
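For example, a rough irb sketch using the stdlib json library:

    require 'json'

    dict = { employee_id: 42 }
    dict[:employee_id]                    # => 42

    round_tripped = JSON.parse(dict.to_json)
    round_tripped["employee_id"]          # => 42, the key came back as a String
    round_tripped[:employee_id]           # => nil, the Symbol key silently stops working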
Yeah, symbols are terrible, and they lead to using Mashes or "hashes with indifferent access" to try to allow both syntaxes. That helps with round-tripping to JSON and back and gives consistent access either way, but values are still not converted. And values shouldn't be symbolized when parsing JSON, which means a round trip through JSON typically turns symbols into strings.
It would be a lot easier if symbols had been just syntactic sugar for immutable frozen strings so that :foo == "foo".freeze == "foo" would be true.
And under the covers these days there is very little difference. It used to be that symbols were immutable and not garbage collected and fast. And that strings were mutable and garbage collected and slow.
These days symbols are immutable and garbage collected and fast and frozen strings are immutable and garbage collected and fast (and short mutable strings are even pretty fast).
Symbols as a totally different universe from Strings I would consider to be an antipattern in language design. They should just be syntactic sugar for frozen strings if your language doesn't already have frozen strings by default.
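For comparison, a small sketch of where modern Ruby already lands (CRuby 2.3+ behavior assumed):

    :foo == "foo"                        # => false, a Symbol never compares equal to a String
    "foo".freeze.equal?("foo".freeze)    # => true, frozen string literals are deduplicated
    :foo.frozen?                         # => true
    "foo".frozen?                        # => false without the frozen_string_literal pragma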
Symbols in Ruby are meant to be more performant than strings, iirc. If I have the symbol :a, it's allocated once regardless of how many times it appears, as opposed to "a", which is allocated anew every time.
I guess it's similar to Python keeping a single instance of small integers. PlayStation also experimented with caching small floats, which gave them some perf improvements too, but I think it wasn't as performant in all cases.
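A quick way to see the Ruby side of that in irb (default settings, no frozen_string_literal pragma):

    :a.object_id == :a.object_id      # => true, every :a is the same Symbol object
    :a.equal?(:a)                     # => true
    "a".object_id == "a".object_id    # => false, each "a" literal allocates a new String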
Lua (and some other languages) intern strings, so all strings that are the same point to the same string instance. This gives the same benefits (plus string equality is just pointer equality) without a different type.
There is a caveat in older Ruby versions: symbols aren't garbage collected, so they shouldn't be created from things like user input. Not a problem since 2.2, though.
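The classic hazard pattern looked something like this (a sketch; the variable names are made up for illustration):

    settings = {}
    user_supplied_keys = ["color", "font", "theme"]   # imagine these arrive from a request
    # Before Ruby 2.2, every distinct Symbol lived forever, so symbolizing
    # attacker-controlled strings could grow the symbol table without bound:
    user_supplied_keys.each { |key| settings[key.to_sym] = true }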
Symbols can even improve performance. Replace them with integers at compile time, like a global enum, and the runtime only needs to compare integers instead of potentially lengthy (especially if UTF-16) strings.
> Replace them with integers at compile time, like a global enum, and the runtime only needs to compare integers instead of potentially lengthy (especially if UTF-16) strings.
All of those strings will be interned, and can thus be compared by identity, which is an integer comparison.
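In Ruby terms, that's roughly the difference between == on two long, distinct strings and equal? on two symbols (sketch):

    a = "x" * 1_000
    b = "x" * 1_000
    a == b            # => true, but the comparison may have to walk both strings
    a.equal?(b)       # => false, two separate objects

    :some_long_name.equal?(:some_long_name)   # => true, one object, so the check
                                               #    is effectively an integer compare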