Rust 0.9 released (mail.mozilla.org)
340 points by kibwen on Jan 9, 2014 | 146 comments


Wasn't this release supposed to finally offer a Windows package which works out-of-the-box?

I just tried it. Only got an error message about a missing GCC component. Seriously, a ~90MB download and that still does not include all the GCC dependencies?

We still have to somehow manually set up a version of MinGW compatible with this particular build? Who is in charge of the Windows port? A Linux user who cross-compiles from an Arch box? No way a Windows user would think that Windows package is acceptable.

I mean, I am not paying anything for it so it is not that I think I have the right to demand better packages but you are losing out on a massive amount of feedback from the Windows world by only offering such downloads.

You are largely limiting yourself to determined people who are already familiar with MinGW-based C/C++ development on Windows (that is a small subset of Windows devs). What about people coming from, say, C#? You expect them to fiddle with a MinGW setup just to build "Hello World"?

Sorry, but for me this experiment with Rust ended just like the 0.8 one. Downloaded Windows installer, ran Windows installer, tried to compile Hello World, got error message about missing components, uninstall, delete.


  > Wasn't this release supposed to finally offer a Windows 
  > package which works out-of-the-box?
Nope! No idea where you got that impression.

  > Who is in charge of the Windows port?
We keep asking for people knowledgeable about Windows to step forward and lead the effort to port us off MinGW and onto MSVC, and none ever have. Would you like to volunteer? :)

To clarify, the goal is to be using the native Windows toolchain by 1.0, if at all feasible. Servo has to work on Windows, after all.


I just started looking at Rust and like what I see so far. I may be able to help; I worked on the Windows port of Go and put together the MinGW build environments as well as the binary installers.

If someone can point me to a list of outstanding issues/requirements and a contact person it'd be much appreciated. Note, I've already found this https://github.com/mozilla/rust/issues/8996 and this https://github.com/mozilla/rust/issues/1237.

I'm assuming the workflow notes at https://github.com/mozilla/rust/wiki/Notes are up to date?


Thanks for offering! I'd suggest that you make a post to the mailing list for maximum visibility:

https://mail.mozilla.org/listinfo/rust-dev

HN comments and IRC chats tend to get lost in the shuffle without always attracting the attention of the devs.


I personally don't know what's necessary now, but if you feel like joining #rust or #rust-internals on irc.mozilla.org (which I recommend :) ), then vadimcn and klutzy are the people to talk to. (IRC handles match their GitHub names.)


...doing it now.


Installing Go on Windows was a breeze... thanks =)


I think using LLVM's native tool chain will end up being more feasible, fwiw.


Indeed, I'd love if LLD somehow matured in time for Rust 1.0. But I have no idea what their timeframe is.


Please don't depend on MSVC. Why not simply bundle the needed MinGW parts?


Please don't depend on MinGW. MSVC is the best supported compiler toolchain on Windows. Use it.


It's also horrible, but Arelius is right, it's the best option.


What's so horrible about it?

It's a very mature toolchain, and has better modern (C++11) support than the version of G++ on my Linux box.


The Express version restricts distribution of the binaries created with it IIRC. Also it's quite a heavy download compared to MinGW.

Please keep in mind that I'm not comparing writing C++ programs with MinGW vs. VC++. Rust just needs a linker for some tasks, which should be lightweight.


No one is suggesting that it should be the exclusive supported toolchain.

It may strictly be a larger download than MinGW, but it's certainly less painful and confusing. Sure, Rust may just need a linker, but MinGW is anything but lightweight.


It sure is the best option for C++ Windows development, but not for Rust to depend on and use.


But having to install the entire mingw toolchain is also clearly not the best for Rust to depend on and use.


Perhaps part of the problem is perception.

When people download an exe installer on Windows, they generally expect it to install something that works out of the box.

Please note that I am not saying it should be expected that there is a one-click (or double-, as it were) solution for Windows at this point; I am just saying that people might get that impression from the download link on the front page, and the fact that it is an exe installer.

Perhaps if you put it in a zip -- I am sure the people who can find and follow the notes for installing the rest of what is required can handle zip.


Just noticed that pressing the exe download link on the front page actually takes you to the wiki entry now, that's helpful.


I'd love to help with this, but I'm under the impression that this would also require getting LLVM working properly under MSVC, which just sounds like an immense task.


What about packing the required libraries into the installer, instead of forcing everyone to play Lego with versions, as a starting point?


Thanks for your report. I understand your frustration, and it's true that Windows support continues to lag. Some of the mingw dependencies were eliminated this release but not all. In particular, Rust still depends on mingw's linker. This is a major concern and Rust 1.0 will not be released without a self-contained toolchain on Windows.


>Wasn't this release supposed to finally offer a Windows package which works out-of-the-box?

Nope.

>I just tried it. Only got an error message about a missing GCC component. Seriously, a ~90MB download and that still does not include all the GCC dependencies?

Seriously, an early adopter that can't handle GCC dependencies?

>Who is in charge of the Windows port?

YOU are, it's open to the community.


That's the kind of attitude that will ensure that Rust does not see truly widespread adoption. Windows is, like it or not, still a very significant platform today. Poor Windows support will hurt the adoption of Rust. It shouldn't be up to some vague "community" to seamlessly support Windows. That's something that the core Rust developers should work to offer.


>Windows is, like it or not, still a very significant platform today.

I agree.

>Poor Windows support will hurt the adoption of Rust.

I agree again.

Where I disagree is that there's any serious hurry for "it just works" Windows support to exist now, when the language is in flux, the core team has tons of other priorities (like, finalizing the design and creating stuff), and we're several releases away from 1.0.

Where I disagree is with the idea that people who try to adopt a compiler and toolchain at a pre-beta stage have a right to whine about it and demand stuff, or even get to blame developers ("who is responsible for the Windows port?").

Where I disagree is with entitlement.

>It shouldn't be up to some vague "community" to seamlessly support Windows. That's something that the core Rust developers should work to offer.

That's what I disagree with.

The core team has schedules, finite time and priorities. And they could even be in over their heads with some stuff, like finalising some language semantics.

That some other developers are saddled with a non-UNIX/POSIX OS doesn't mean the core team must automatically bend over and prioritize them over other stuff. Even if it "hurts Rust adoption".

You know what will hurt Rust adoption more? Spending resources to port a half-finished compiler to various systems, instead of getting it done first.

[NOTE: In no way do I speak FOR the core team, nor am I affiliated with it (other than playing with Rust since the 0.3 days as a user). I just wanted to respond to this particular attitude.]


The core team is working on the language for now, I imagine once the few remaining core features start to stabilise they will be able to shift their focus to things like libraries, tooling and platform support.

Also, Windows support is definitely a priority: every single proposed patch is tested on Windows (as well as Linux, OSX and one of the BSDs, and soon, Android), and any failures cause a rejection.


Don't assume that a dismissive comment from some random person on HN is indicative of the Rust team's attitudes.


Agree 100%. Gaming, for example, is one area where Rust can/will shine, and Windows is very important there.


On Windows, I use Scoop http://scoop.sh/ (vaguely similar to homebrew on OSX). There's a Rust package which will install all the dependencies for you. It hasn't been bumped to 0.9 yet, but I assume that'll happen soon.


Just tried Scoop - very useful especially the sudo for Windows.


Well there is always runas


Yep, Scoop has Rust 0.9 now.


Thanks for sharing, I didn't know about it.


You should be able to use any [reasonably recent] MinGW environment with 0.9. Is this not the case?

[edit] You do need to install gcc in your MinGW, though.


I do not know, but if I recall correctly there was an issue with 0.8 where the then-current MinGW installer would grab packages incompatible with the Rust 0.8 binaries, so you first had to set up the current version of MinGW and then use MinGW's own package manager to roll back to a version compatible with the Rust build.


This has since been fixed. You should be able to use mingw without rolling back anything.


This has been my experience as well, even after trying to follow a few guides.


A selection of some of my favorite aspects of this release:

1. The (still-ongoing) removal of managed pointers, leaving us with one fewer pointer type. Final tally of built-in pointer types: unique pointers, mutable references, and immutable references (see the sketch after this list).

2. The dead code detection pass (https://github.com/mozilla/rust/pull/10477), contributed by a student of the University of Virginia's Rust-based systems programming class (http://rust-class.org/pages/using-rust-for-an-undergraduate-...).

3. The `Any` trait, giving us on-demand dynamic typing (https://github.com/mozilla/rust/pull/9967).

4. The clean abstraction of green threads and native threads out into their own libraries (https://mail.mozilla.org/pipermail/rust-dev/2013-December/00...) such that any library that makes use of the stdlib will work regardless of which strategy the user selects.
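
To illustrate the tally in item 1, here is a minimal sketch using 0.9-era syntax (just an illustration; the exact sigils and syntax may still change):

  fn main() {
      let owned: ~int = ~5;            // unique (owned) pointer, freed when it goes out of scope
      let shared: &int = &*owned;      // immutable reference: a borrow, no ownership
      assert!(*owned + *shared == 10);

      let mut x = 10;
      {
          let excl: &mut int = &mut x; // mutable reference: exclusive access while it lives
          *excl += 1;
      }
      assert!(x == 11);
  }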

We're not quite in the home stretch yet, but there are very few hard blockers left on 1.0. Here's the list that I can think of:

1. Dynamically-sized types (http://smallcultfollowing.com/babysteps/blog/2014/01/05/dst-...)

2. The extension of rvalue lifetimes (http://smallcultfollowing.com/babysteps/blog/2014/01/09/rval...)

3. Struct single (note that's single) inheritance (http://smallcultfollowing.com/babysteps/blog/2013/10/24/sing...)

4. Niceties and sugar to support custom smart pointer types to complete the excision of managed pointers

As far as I know, the devs are still aiming for a 1.0 release in 2014. The 1.0 release will not necessarily mean that the language is done evolving or ready for production use, but it will mean that the developers will begin honoring language-level and library-level backwards compatibility. I would expect at least two more unstable point releases (i.e. 0.10 and 0.11) before a 1.0 release occurs.


Excellent. It was the multitude of different pointer types that confused me the most, so it's great to hear that they are simplifying this.


My guide got merged in for this release: http://static.rust-lang.org/doc/0.9/guide-pointers.html

TL;DR: you don't need them as much as you may think, so don't let that scare you away!


Inheritance may not make 1.0.


I'm finding that my OO code style these days is very composition heavy with pretty much no inheritance use. I think given Rust's type system, I'm not going to miss inheritance much.


I believe Rust wants inheritance primarily for Servo, because the DOM is defined in terms of inheritance.


Right. Idiomatic Rust prefers composition over inheritance.
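
Roughly, that style looks like this (a made-up example in 0.9-era syntax, not from any real codebase):

  trait Area {
      fn area(&self) -> f64;
  }

  struct Wheel {
      radius: f64,
  }

  impl Area for Wheel {
      fn area(&self) -> f64 { 3.14159 * self.radius * self.radius }
  }

  // Composition: a Car has wheels rather than being a Vehicle subclass,
  // and shared behaviour comes from traits instead of a base class.
  struct Car {
      wheels: [Wheel, ..4],
  }

  fn main() {
      let car = Car {
          wheels: [Wheel { radius: 0.3 }, Wheel { radius: 0.3 },
                   Wheel { radius: 0.3 }, Wheel { radius: 0.3 }]
      };
      assert!(car.wheels[0].area() > 0.0);
  }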


Every language in the universe prefers composition over inheritance =P


Not very true.

Some languages prefer inheritance to express is-a relationships and composition to express has-a relationships. Some languages try to shoehorn is-a into being equivalent to has-a, but they usually have trouble expressing non-structural subtyping.


I find that when refactoring code, I often redefine is-a to has-a anyway.


is-a vehicle --> has 4 wheels


Runtime error: Motorcycle lacks 4 wheels


The ones that don't blossom into interface factories which make factory adapters of abstract factories. (I'm looking at you, Java)


For the lack of a nail, throw new HorseshoeNailNotFoundException("no nails!");

For the lack of a horseshoe, EquestrianDoctor.getLocalInstance().getHorseDispatcher().shoot();

(...)

http://steve-yegge.blogspot.com/2006/03/execution-in-kingdom...


If that is the only use case, I would rather have another solution, to keep the language minimal. Just like garbage collection was moved to a library.


Can't really do it. I tried. The type checker just plain has to know about inheritance for it to work and be safe.

The good news is that's an extremely constrained sort of inheritance, very unlike traditional OO in that it doesn't have base classes--it's just a special extension to the trait (typeclass) system that allows you to require that structs begin with a certain set of fields. In fact, I don't even call it inheritance these days--I prefer the name "structural constraints".


ooh, is this like record subtyping? Have you seen ermine https://github.com/ermine-language or Ur/web? http://www.impredicative.com/ur/

[edit, on #rust it was explained to me that it's only for pointers to structs, but that's fine by me]


If anyone is interested, the proposal for Rust is detailed in http://smallcultfollowing.com/babysteps/blog/2013/10/24/sing...


I think this is a good idea. It seems similar to Go's embedding: http://golang.org/doc/effective_go.html#embedding


The difference is that the base struct can call methods on the derived struct, whereas this is not possible with that kind of struct embedding. This is important for some use cases (e.g. the DOM or the render tree).


I see.


Thanks for the answer, and I think I like your solution to the problem.


I think that the core team thought about this long and hard, but it was decided that there was no other solution. A virtual method call to look up some property shared by all the DOM nodes was just too expensive: a constant offset field look up is far far faster.
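
Roughly, the win is that the shared data sits at a fixed offset in every node, so reading it is a single load. A hypothetical sketch (not Servo's actual types) using plain composition just to show the layout idea:

  struct NodeFields {
      flags: u32,
      // parent, children, etc. would live here too
  }

  // Every node type embeds the shared fields first, so `flags` sits at a
  // constant offset; no vtable lookup is needed to reach it.
  struct Element {
      node: NodeFields,
      tag: ~str,
  }

  fn flags_of(e: &Element) -> u32 {
      e.node.flags
  }

  fn main() {
      let e = Element { node: NodeFields { flags: 1 }, tag: ~"div" };
      assert!(flags_of(&e) == 1);
  }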


Just because people have overused inheritance in the past doesn't mean it's completely useless.


Nobody has said it was useless.

Although with the right language support it is: composition can stand in for it (demo: Self).


By that reasoning most programming features are useless (demo: C)


I'm reasonably certain you completely misunderstood my comment, because C definitely isn't an illustration of it.


It's a question of performance more than composition.


Please elaborate?


Ah, news to me. I guess it can be introduced without breaking backwards-compatibility.


As someone who writes a lot of C++ and has played with Rust (i.e., hello world), how does the language work when you remove shared pointers?

Say I have an early-generated state tracker for a whole bunch of events propagating in an event loop somewhere, that I want discarded when all the events finish. They are all separate tasks in different threads, so I either need to have some job that just checks if they all finish to clean up, or pass around shared data structures. How would you do that in Rust?


I actually make it a point to do most of my code without shared_ptr, opting instead for unique_ptr. There was a talk somewhere about the "best written library you should never use" and shared_ptr was the focus.

For the example you mentioned, you could have the parent object of those tasks responsible for cleanup of the shared object (e.g. each thread's destructor calls a finish function on the parent object). This is just one possible implementation. I don't encounter this as much because I prefer a shared-nothing approach where possible.

edit:typo


If the structure is immutable after you've produced it, you can use extra::arc::Arc<T> [0], and once all the references are dropped the memory will be deallocated. I haven't used C++, so this might be very similar to std::shared_ptr or not, but this is usually what one would use in Rust. (There's also extra::arc::RWArc<T> [1] when you need mutability, but that isn't used much in my experience.)

[0] http://static.rust-lang.org/doc/master/extra/arc/struct.Arc.... [1] http://static.rust-lang.org/doc/master/extra/arc/struct.RWAr...
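
A rough sketch of that pattern, assuming the 0.9-era extra::arc API (Arc::new, clone, get); treat the exact method names as from memory rather than gospel:

  extern mod extra;

  use extra::arc::Arc;
  use std::task::spawn;

  fn main() {
      // Build the immutable data once, then hand it to several tasks.
      let table = Arc::new(~[1, 2, 3, 4]);

      for _ in range(0, 3) {
          let table = table.clone();   // bump the reference count
          spawn(proc() {
              // Read-only access from this task; the vector is freed
              // once the last Arc is dropped.
              assert!(table.get().len() == 4);
          });
      }
  }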


Documenting Rust must be a major pain.


Disclaimer: I've written a bunch of it.

It's actually not too bad. The biggest problem with the docs currently is that it's a patched-together set of things people have written at random over the last few years. Mozilla has taken some concrete steps to address this, it'll be much better overall soon.


Sorry, I was making a joke about how Rust changes a lot. You guys do great work considering.


Ha, no worries. Thanks!


This release also heralds the ability for the built-in documentation generator to run any code examples, which has helped us keep the documentation up-to-date a lot.


Aren't unique pointers also scheduled for decommissioning?


No, not at all. Unique pointers are awesome and are the foundation for lots of other features.

Now, some of the syntax for using them might still change. That's yet to be decided. But it wouldn't be Rust without unique pointers.


Are "unique pointers" == "owned pointers"?


Sorry about the confusion, I just translate ~T as a Uniq<T> because they are unique (i.e. you can't duplicate them without cloning) and they were called that on a mailing list a few times (IIRC).


Yes, ~T.


Undecided. They will always be somewhat magical in that the "box" operator creates them by default. But the syntax for them may change, or be moved into the library.


You can put a pointer type into a library?


Everything except the two reference types (& and &mut) can be trivially implemented in libraries: Rust is a low level language.

E.g. the following is almost exactly identical to `~T`

  #[unsafe_no_drop_flag]
  struct Uniq<T> { priv ptr: *mut T }

  impl<T> Uniq<T> {
      fn new(value: T) -> Uniq<T> {
          unsafe {
              let ptr = malloc(size_of::<T>()) as *mut T;
              if ptr.is_null() { fail!("malloc failed"); }

              move_val_init(&mut *ptr, value);
              Uniq { ptr: ptr }
          }
      }

      fn borrow<'a>(&'a self) -> &'a T {
          unsafe {&*self.ptr}
      }
      fn borrow_mut<'a>(&'a mut self) -> &'a mut T {
          unsafe {&mut *self.ptr}
      }
      fn move(mut self) -> T {
          unsafe {
              let v = ptr::read_ptr(self.ptr as *T);
              free(self.ptr);
              self.ptr = ptr::mut_null(); // mark as moved so the destructor becomes a no-op
              v
          }
      }
  }
  #[unsafe_destructor]
  impl<T> Drop for Uniq<T> {
      fn drop(&mut self) {
          // with #[unsafe_no_drop_flag] the destructor may run on a
          // zeroed-out value, so guard against the null pointer first
          if self.ptr.is_null() { return }
          unsafe {
              // this will free the pointer and run the
              // destructor of the inner type
              replace(self, transmute(0)).move();
          }
      }
  }


Rc and Gc are now libraries you can import. A nice advantage is that Arc, MutexArc and any other reference types look and work the same way.


As always, drop by irc.mozilla.org #rust if you'd like to chat or ask questions. We're a friendly bunch!

http://chat.mibbit.com/?server=irc.mozilla.org&channel=%23ru...


I really want Rust to succeed, but I'm not a systems programmer. Do people feel that there is still a place for Rust among those who typically work with higher-level languages? The functional aspects and type system of Rust look really appealing, and I'd love to do my part to help it do well by actually using it.


I was exclusively a Javascript/Python/Java programmer before jumping into Rust. If you're looking to learn a systems language with real bare-metal capabilities, Rust's safety features will go a long way towards keeping you on the straight and narrow as you figure out the concepts essential to low-level programming.

And because Steve seems reluctant to plug his own book, here's Rust for Rubyists, which is an introduction to Rust geared towards users of higher-level programming languages: http://www.rustforrubyists.com/


This reminded me that I needed to actually release the 0.9 update I wrote over the weekend! Thanks :)


Thanks kibwen for the link, and thanks steveklabnik for writing this. I've been interested in Rust and this looks like a nice way to get my feet wet.


You're welcome! Feedback appreciated.


I want to thank you for a good introduction to rust and can't wait to read the 0.9 updated version.


You're very welcome. I missed one or two things, so there will be another update this weekend.


Anything like a rustforpythonistas been mentioned anywhere?


It doesn't really require a knowledge of ruby, so you might want to have a look.

From the FAQ on the front page:

Do I have to know Ruby?

Really, I love alliteration: the only thing the 'for Rubyists' really means is that I assume you don't know about pointers, concurrency, or similar things. That's okay, you've never had to think about them before! I explain this stuff in extra depth. If you program in another dynamically typed language, you'll be just fine. If you program in another systems language, you'll be more than fine.


There is nothing about the Rust language that makes it ill-suited to higher level work. It's a much easier, safer, language than C++ (which is used for higher level development in MFC) or Objective-C (which is used for higher level development in Cocoa).

At this stage though, the library support for higher level functions is lacking – mostly due to the relative youth of the language. You won't find much support for UI code, and there's a relatively small number of libraries supporting HTTP servers and services.

You could write bindings for these things yourself (and they'll certainly be written over time) but you won't find it out-of-the-box right now.


FWIW, it's worth being precise on what "easier" means: Rust isn't necessarily an easier language to get started in than C++ and so on. Like Haskell, the compiler complains at you, and complains a lot. (Which can be rather discouraging for a beginner.)

However once you've satisfied the compiler, your program often runs perfectly first time (modulo logic bugs, which a compiler can help with (like warning about dead stores, and mutable variables that aren't mutated... which rustc does) but can't detect or fix in general). This is in stark contrast to C/C++ etc where a program normally needs a liberal application of gdb and valgrind to be safe.

Hence: I'd say it's harder to write code but easier to write correct code in Rust.


There'll always be some place for manual memory management - and if the languages for it get better then that space gets wider. I like the fact that they're taking typing seriously, but assuming you don't need the manual memory management I've yet to see any examples of where Rust wins over Haskell/Scala/F#, or even OCaml (though I could well believe it would perform better).


I'd say concurrency support is a big one: Rust rules out data races at compile time, eliminating the need for race detectors, while ensuring that you still have the full array of options when it comes to shared memory, locks, and so on.


What do you mean by "rules out data races"? If I have one piece of code that writes two variables and another that reads them on another thread, there's no way for the compiler to know whether those two changes need to be atomic, serialized, or uncoordinated.


Access to shared mutable memory without use of a synchronization primitive. It's the same definition data race detectors use.


I believe the "data race" definition pcwalton is using is the one from C++11: "a race condition between non-atomic variables". http://en.wikipedia.org/wiki/Race_condition#C.2B.2B


2 threads can't access the same variable in Rust without unsafe{}. I think, anyway...


I am primarily a Rubyist, but I'm head over heels for Rust. I'm pretty sure I've inculcated wycats with similar feels.

You might be interested in http://words.steveklabnik.com/rust-is-surprisingly-expressiv... , which is sort of about this.


Ditto for me, but I came from a largely C background prior to Ruby


FWIW I see a lot of people on #rust who come from a C# background.


I've been 'stalking' Rust for quite some time, but never got around to diving in. I'm essentially a web programmer who's been building fairly large web-based systems. I'm sick of dealing with programming languages that consume shitloads of resources (CPU, memory, whatever) for doing essentially trivial stuff (compared to the amount of resources they use).

Will Rust be able to help me write super-efficient web programs that run blazingly fast and don't gobble up RAM?

Besides, how painful/easy is string manipulation in Rust?


Yes, Rust will be super-efficient, blazingly fast, and memory-lean. But it's not nearly as good at rapid prototyping as, say, Ruby, so you'll have to temper your expectations.

String manipulation in Rust is currently lacking while we work on the API. We also strictly enforce that all strings are UTF-8, so while this gives us great confidence that we're future-proof it adds a whole lot of difficulty in implementing a fast and correct means to work with strings.


Can you elaborate on why UTF-8 makes things harder? Is it because one byte no longer means one character? So something like calculating a string's length is also tricky. Is that it?
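
For instance (a tiny sketch, assuming `len()` on a string slice counts UTF-8 bytes):

  fn main() {
      let s = "naïve";
      // Five characters, but the 'ï' encodes as two bytes in UTF-8,
      // so the byte length is six; "length" depends on the question asked.
      assert!(s.len() == 6);
  }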

Also, where can I read about the general direction the string manipulation API is going to take?


The bigger problem is that Unicode is immense and complex, and trying to implement the full specification is rather daunting.

There are no real documents concerning the future of the string API; right now people are just implementing things as they need them.


So is the next milestone 1.0 or 0.10?


0.10


with finalized pointers?


What do you mean?


These deep changes to pointers are holding me back from trying out Rust. When I last checked Rust a week ago, none of the pointer-related documentation or examples were compatible with the head revision. Putting energy into the released version felt bad since that will break soon, but head is not that useful without documentation. It's been like this since summer; I'm looking for a stable codebase with documentation and base libraries.


If you're looking for a stable language you shouldn't use Rust right now. The language is still changing every few days.

If you have a bit more time to spend it's fun, though -- I spent a lot of time in the IRC channel. The people in #rust are really helpful and will answer all your pointer questions.


Some of the syntax has been changing, but the semantics have been the same.


> normalized to Unicode normalization form NFKC

I'm wondering why they chose NFKC (compatibility composition form) instead of NFC (canonical composition form).

`ª` would become `a`, losing its superscript form. `ᵤ` becomes `u`, losing its subscript form. `Ⓐ` becomes `A`, losing its enclosing circle. As for multi-codepoint mappings, `¼` would become three tokens `1⁄4`, where `⁄` (U+2044) doesn't map to `/` (U+002F).


Maybe you are referring to the reference [1], which indeed mentions NFKC. As far as I know there is no consensus on the normalization form [2], and the current implementation is not guaranteed to stay, which is why Unicode identifiers are gated behind a `#[feature]` flag.

[1] http://static.rust-lang.org/doc/0.9/rust.html#input-format

[2] https://github.com/mozilla/rust/issues/2253


Yes, I started reading the reference. The normalization form issue is different from the #[non_ascii_idents] feature, though.

Issue 2253 does address it, but all the comments discuss NFC/NFKC normalization specifically for filesystem lookup and for program identifiers, not for the lexing stage. That issue is obviously the best place to continue any conversation about it.


(Also, as that bug suggests, we don't actually do any normalisation at all yet.)


I don't see any mention of normalisation in the release notes.


Rust does seem to do lots of good and interesting stuff. I was trying to find a small number of languages covering all paradigms, and Rust featured on the final list:

https://news.ycombinator.com/item?id=7026970


Of course I welcome the removal of GC syntax, but I am starting to doubt when the Rust language specification can be stabilized. By version 0.9 it seems like it should be stable, but they're still making big changes to the language.


Indeed, in the entire history of humankind there has never been a stabilized release of Rust. History is clearly against them.

More seriously, look at the release rate. It's pretty frequent. There's clearly heaps of work going on. It's not a moribund project.


In general: one cannot expect a 0.x version to be stable or not to be followed by an incompatible version. A 1.x version is a different story.


Well, it's understandable where the concern is coming from. We've seen 0.x releases of Rust for nearly two full years now, and even the most recent ones have had significant changes. While we've heard that a 1.0 release is planned for sometime during 2014, there's little to suggest that things are actually stabilizing. It isn't Perl 6 yet, but the ongoing lack of stability is making some potential Rust users become skeptical.


There have been fewer and fewer breaking language changes over time. The migration from 0.8 to 0.9 was less of a burden than the migration from 0.7 to 0.8, and so on. There is also a lot of progress on making a list of blocking issues, and at this point if anything can be added backwards compatibly it is almost always not a priority for 1.0.

Making a safe programming language that doesn't require GC or a runtime (while allowing allocation) has literally never been done before in industry. It has taken time to get something usable. The current overall design seems pretty workable now, as evidenced by all the projects people are starting to write, so I think it's basically just a matter of finishing off rough edges at this point.


If anyone's curious, here's all it took me to update Rust for Rubyists from 0.8 to 0.9: https://github.com/steveklabnik/rust_for_rubyists/commit/ae2...


That's actually a lot less than I'd expect.

Kinda curious about the do N.times -> for foo in range(0, N) changes. I thought I remembered a recent post from you about that. Any reason for the switch? (I ask as someone that's used Ruby too much myself and is currently switching cold turkey to plain old C while Rust gets a bit more mature.)


It's because `do` no longer applies to stack closures [1]. `do` still supports owned closures (what used to be called `~once fn()` in 0.8, now renamed to `proc()`), but those can only be called once, which defeats the whole point of `times` (unless you are fine with the `N.times(|| { ... })` syntax).
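
Concretely, the update looks roughly like this (a sketch from memory, not the exact diff from the book):

  // 0.8 style: `do` sugar passing a stack closure to `times`
  //     let mut count = 0;
  //     do 5.times {
  //         count += 1;
  //     }

  // 0.9 style: plain iteration, since `do` is now reserved for procs
  fn main() {
      let mut count = 0;
      for _ in range(0, 5) {
          count += 1;
      }
      assert!(count == 5);
  }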

[1] https://github.com/mozilla/rust/issues/10815


Very interesting thanks!


Either they do this now, or they get stuck being unable to do it after 1.0.


Are there any plans to further simplify and/or pare down the language?


Yes, the @ built-in pointer is being removed as special, replaced by library types like Rc and Gc (for reference counting and garbage collection respectively). There's a vague suggestion for ~ to become a library pointer too, but I don't think this has been totally accepted by the core team. (This would leave only raw pointers and references actually defined in the language.)

There are some plans related to simplifying how vectors and trait objects ("dynamically sized types" or DST) interact but I don't understand them well enough to be able to explain. :)


OT: Is Rust using operator overloading? Are user-defined operators a possibility in the future? I googled it and the most I could find was this:

http://www.reddit.com/r/rust/comments/1le6vu/i_wrote_a_proto...


Operator overloading, yes. But we place some restrictions on which types you can overload in order to keep things sane (oversimplified explanation: you must have declared the types yourself in order to overload operators on them). We also discourage the wanton overloading of operators to mean completely random things, as you might have encountered in C++.

As for user-defined operators, if you mean something like Haskell's custom infix operators then no, I don't believe there are any plans for those.
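
For example, overloading + for a type you define looks roughly like this (a sketch assuming the 0.9-era two-parameter Add trait; the exact trait shape may still change):

  struct Vec2 {
      x: f64,
      y: f64,
  }

  // Allowed because Vec2 is declared in this crate.
  impl Add<Vec2, Vec2> for Vec2 {
      fn add(&self, other: &Vec2) -> Vec2 {
          Vec2 { x: self.x + other.x, y: self.y + other.y }
      }
  }

  fn main() {
      let a = Vec2 { x: 1.0, y: 2.0 };
      let b = Vec2 { x: 3.0, y: 4.0 };
      let c = a + b;
      assert!(c.x == 4.0 && c.y == 6.0);
  }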


> As for user-defined operators, if you mean something like Haskell's custom infix operators then no, I don't believe there are any plans for those.

Do the Rust designers actively oppose having them, or just not have any concrete plans to add them?


Merely lukewarm, I think. There are simply much higher priorities to focus on at the moment. Who knows what Rust 2.0 might hold!


I've always found the profusion of infix operators in Haskell to be heinously confusing, and it seriously degrades the readability and usability of libraries.


The problems I see with infix operators are:

- It's not necessarily obvious what is an infix operator

- Precedence and associativity

I think Haskell has solved the first one: with all non-alphanumeric symbols in expressions being infix operators, and likewise for alphanumeric functions written with backticks.

But precedence and associativity is not obvious, since that is something that you can customize. I think that user-defined infix operators with some severe limits on choosing precedence and associativity is a good compromise (many use backticks on functions in Haskell, and that has a default precedence).


That's great. Really, the only case in C++ I want operator overloading, is to implement standard math operators on custom types, like Vector3s, Matrices, and Bignums.


Assuming you use smart pointers, you'll want overloading for pointer operators as well, even if you're not implementing the smart pointers yourself.


I'm not however sure that applies to Rust. And frankly I care a lot less about the pointer operators for smart pointers. Any old accessor is decent, and I find the overloaded pointer operators sometimes confusing. At the very least it's not quite as obvious as it is for math operators.


I think that Reddit thread was interesting. The syntax is taken from Haskell, but someone brought up ditching precedence and only associating in one direction, which I guess is simpler than operator overloading(?)



There is this: http://maniagnosis.crsr.net/2013/04/operator-overloading-in-...

I think a current version should still be similar, although I haven't had a chance to take a look at the code for 0.9. I also notice that I forgot to update that with the double-dispatch approach.



