This situation is why "hello world" exists; the program seems trivial, but it actually establishes several important things: that your environment works, that you know how to feed programs to the compiler, and that you can actually run the output.
I never took "hello world" seriously until I started doing embedded systems work; it's now an extraordinarily important tool for me (I work in an unusually large number of different environments).
I once had a particularly slow week at work, and decided to write an article about how "Hello World" works in C. How it really works -- including compiling, linking, system calls, operating system involvement, hardware I/O, communication between ICs, etc. The path from the block of text to photons emitting from the screen, with fairly detailed explanations at every level.
I ended up with a 50 page LaTeX document that barely even scratched the surface, and then work picked up and I never finished it. A "full" explanation would probably be a 600+ page book.
I still think it would be a neat resource to have. We kind of take for granted that the simple "Hello World" is built on 60 years of research, and is anything but simple.
A wee bit heavy, but it's comprehensive. It deals with what happens when you run code and how the architecture of the computer works (by and large), including at the logic level:
If you want to go lower (and higher), look at Understanding the Linux Kernel for a good picture of how an OS is put together, with a specific example, i.e. Linux.
Code, by Petzold, deals with logic and computers from the ground up. It starts with relays and builds them up into gates and usable arithmetic blocks.
The physics is fairly simple, at least from a CRT or LED display perspective. It gets trickier when dealing with interconnecting microprocessors, because a good chunk of that is vendor-specific.
I think this kind of project is well suited to a guide on building a computer from the ground up: starting with logic gates, writing a real-time OS, and developing a scripting language that will run and compile on it. Then you can skip a lot of largely extraneous stuff and still come away with a solid understanding of how the hardware works.
> How it really works -- including compiling, linking, system calls, operating system involvement, hardware I/O, communication between ICs, etc. The path from the block of text to photons emitting from the screen, with fairly detailed explanations at every level.
WOW, you should post that LaTeX document regardless of it being unfinished! The sheer amount of effort you put in deserves some sort of release :)
This is the sort of thing I was looking for when I started wanting to understand the 'big picture' of computer programming. Would be great if you posted it!
Please, please post this. Or if not would you be willing to share it privately? This is the sort of reading that really gets my rocks off. My email is in my profile if you're interested at all. I really would love to see this.
Agreed. "Hello world" is how you test that you are able to create a particular software artifact at the most basic level. My first kernel device driver was a NetBSD one called /dev/hello; it simply spit out "HELLO WORLD " over and over, for as long as you cared to read it. But it showed me how drivers do I/O and that I could write one and link it in.
Another important outcome of a successful "hello, world!" is that you find some kind of MVP - Minimal Viable Printf.
Just because it is C doesn't mean you've got a sane libc. Let alone the luxury of a debugger. Just seeing that simple string on the screen/logfile can be a blessed relief.
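To make that concrete, here's a sketch of a "minimal viable printf" on bare metal, with no libc at all. The UART address and register layout are hypothetical and board-specific (a real UART would also want a ready-poll before each write); the point is how little you need to get that first string out:

    #include <stdint.h>

    /* Hypothetical memory-mapped UART transmit register. */
    #define UART_TX (*(volatile uint8_t *)0x10000000u)

    static void putch(char c) { UART_TX = (uint8_t)c; }

    static void print(const char *s) { while (*s) putch(*s++); }

    /* Just enough formatting to debug with: hex-dump one 32-bit word. */
    static void print_hex(uint32_t v)
    {
        for (int shift = 28; shift >= 0; shift -= 4)
            putch("0123456789abcdef"[(v >> shift) & 0xf]);
    }

    void hello(void)
    {
        print("hello, world\r\n");
        print_hex(0xdeadbeef);   /* prints "deadbeef" */
    }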
I learned C by reading K and R and doing the book's exercises . . . on paper. At the time I didn't have access to a C compiler, so I wrote them all out in a notebook. A month later I got a job at a shop that was running Unix, and got the chance to type my programs in and try them.
I had a lot of things wrong. It took me a while to understand the difference between control-D and EOF, for instance (how embarrassing). But the 30 days I spent without a compiler made me think about program behavior.
I'm not saying this is a great way to learn a language, but it can be done.
I keep hearing people complain about K and R being "a terrible book." For me it was perfect: pragmatic, succinct, with great examples and good exercises.
And that is what anyone who goes through K&R will do: read and write lots of real programs. A simple program is worth a thousand words!
Which, unfortunately, the majority of technical books just can't get right. They explain the concepts in five full, dull pages, and then at the end: "Hey, check out this little snippet of code, which by the way does nothing really interesting, but is here to illustrate what the author was talking about :)"
I'd like to add some emphasis on reading. For a beginner, writing is obviously very very important as the way to get to the point where he can just sit down and solve a given task.
But reading lots and lots of code (from many different sources) will help him pick up idioms and find common & good solutions to specific problems. Even for a smart person, I think it'll take a lot of time and effort to arrive at the cleanest way of doing things.
Of course there's a risk of picking up bad habits, but if you read lots of code, you should eventually develop a feel for what's clean and readable and easy to understand, and what's messy and wrong. This sadly doesn't help with issues like undefined behavior, but you don't learn these just by writing either. For that you need to look into the spec or some other text that'll cover these.
EDIT: I like to think that I'm a fairly good C programmer (having coded nearly all my life, mostly in C), but I still peek into others' code all the time to find out how they've solved things I'm about to do.
I'll second this notion. I no longer write C, but it is what I learned in high school. My teacher had an interesting class requirement: we needed to review one C/C++ Users Journal article. Length, difficulty, or topic didn't matter - just read a trade article and write about it. We also had to code a lot, and we had to write code by hand on paper tests, and we had some fun competing, but the journals let me know what the Real World was doing.
It takes a large volume of work to really get into programming, and I think that reading is easier than writing, so it's a fast way to dig in (keeping in mind that feedback is critical - so reading without writing has severe diminishing returns)!
People don't like K&R? I can understand it not being suitable for children and the very beginner, but it's one of the very few programming books that are useful, succinct, pragmatic, don't get bogged down in APIs, and are written clearly.
(The companion book "The UNIX Programming Environment" is the get-you-started guide from the very beginning, although it assumes 1970s terminal defaults.)
Edit: Most resources on the Internet are too confusing for beginners. As a matter of fact, people who know C are usually unable to teach it to novices. Good introductory resources for C are rare - which fits C's elitist aura.
That's great. I learned C by telling people I had C experience; then, when I got a job offer, I was forced to learn it pretty quickly over a weekend, powered by coffee, Yorkie bars, and a pirated copy of Microsoft C. I had plenty of experience with assembly at the time.
Fortunately, I wandered into a company that had even less of a clue than me. It's nice to be an expert on day one with only two days' expertise :)
And that may give you a leg up on the rest of us. Not enough thinking of this sort gets done.
In fact, this is how one of the great teachers of CS, Dijkstra, taught computer science.
I have been known to complain about the second edition--the first is the one that I go back to. But it is an absolutely fundamental book for understanding the practicality of programming. I agree with your assessment.
I find the visceral act of manually writing code to be an extremely effective tool when learning programming language syntax. I find it similar to the "Write these words 10 times" technique used when teaching grade school kids how to spell and read.
I guess this is sort of how I'm thinking about it: forcing people first to think about how something should work, rather than just duplicating some code and seeing that it works. Thanks for sharing!
It's essential to follow pjmlp's advice, since C otherwise doesn't give a lot of feedback about one's mistakes: learning from them might take a long time, because a quite broken program (think writing past the memory allocated for one object) might still appear to work.
Use your compiler, and use it well, with all its warnings. Run your programs under valgrind or some such.
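To make the "broken but apparently working" case concrete, here's a sketch of the classic heap overrun; most runs will happily print "ok", which is exactly the problem:

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    int main(void)
    {
        char *name = malloc(4);
        strcpy(name, "hello");     /* 6 bytes into a 4-byte block:
                                      undefined behavior */
        printf("ok: %s\n", name);  /* often "works"... until it doesn't */
        free(name);
        return 0;
    }

    /* Compile with warnings and run it under valgrind, e.g.
     *   cc -Wall -Wextra -g overrun.c && valgrind ./a.out
     * and memcheck reports the invalid writes past the block. */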
Nice suggestion, although I can't seem to find where one would look for those "pretty colours" you mentioned; I certainly can't see any of them in my Mac terminal...
"-Weverything" if the code is compiling perfectly, but you're just bored. If you're not a complete masochist try turning off some of the sillier ones (-Wno-padded -Wno-unused-parameter -Wno-conversion)
Absolutely, but my way of going about this would be to first show _why_ something is a best practice, rather than forcing people to take it at face value.
A more conservative rule would be not to use C where security is a concern, unless you know what you're doing.
When you're writing the kind of software that's an invitation to hackers, like a web application, you should favor a language like PHP or Ruby, which takes things like buffer overflows out of the equation, and even then, you should know what you're doing.
> A more conservative rule would be not to use C where security is a concern, unless you know what you're doing.
If you follow my posts, you will see that I argue C and C++ should be replaced by safer systems programming languages, which have existed since the Modula-2 days.
Having said this, C and C++ are still used everywhere and will outlive most of us.
So when using them, for whatever reasons, at least one should take care to use the best practices regarding how to write secure and safe code in those languages.
Where to begin when learning C? Start at http://c.learncodethehardway.org/. It's going to be tough to top Zed Shaw's approach. The best way to learn to code is by writing code.
I think I have some ingrained fear of making errors, from the days where a simple error might cause your program to chew through your whole hard drive. Actually I don't know if that was ever the case, but that's how I felt.
Anyway, the hardest step for me in learning a language is when I take some running code and make one change to see what happens. Once I get in the swing of it, it gets much easier, but that first step is still hard to do.
For people to really understand C and become proficient in it, I always recommend that they first learn the basics of programming (memory, basic types, looping structures, and array manipulation) in a simple assembly language such as 6502. This helps immensely with understanding pointers and deciphering the many cryptic C compiler messages.
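As a sketch of why that helps: once you've done 6502 indirect addressing by hand, a C dereference stops being magic. Roughly (the zero-page location for the pointer is, of course, up to you):

    #include <stdint.h>

    uint8_t first_byte(const uint8_t *p)
    {
        return *p;   /* roughly the 6502's:
                      *     LDY #$00
                      *     LDA (ptr),Y   ; load the byte at the address
                      *                   ; held in zero page at "ptr"
                      */
    }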
Install Valgrind. It makes error messages a lot less cryptic (my #1 problem with C... not that it's limited to C). If you don't get proper feedback, it's not learning, just banging your head against the wall.
I've had a heck of a time getting Valgrind working on a mac (especially since Mavericks), hardly finding any resources about it. Is there an alternative, or do you have any advice on getting it to work?
In a similar vein, learn how to use a debugger, such as GDB or DDD. Even the bare minimum (getting a stack trace from a crashing/segfaulting program) is incredibly helpful.
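A sketch of that bare-minimum workflow, with a program that segfaults on purpose (file name and commands are just one way to do it):

    #include <stddef.h>

    static void boom(void)
    {
        int *p = NULL;
        *p = 42;        /* deliberate null-pointer write: segfault */
    }

    int main(void)
    {
        boom();
        return 0;
    }

    /* Build with debug info, run under GDB, get the stack trace:
     *   cc -g crash.c -o crash
     *   gdb ./crash
     *   (gdb) run
     *   (gdb) bt      <- shows boom() called from main()
     */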
Currently I'm learning Python, and have been for over a year now. I've been thinking about moving to C as my next language. Is this a good idea? I'm eventually going to want to learn C++, but I'm not going to learn Java until I have to. Also, what are some really good C learning resources? I know there's K&R and c.learncodethehardway.org.
As other people said earlier, reading existing code is great when you have good sources. I'm still almost a noob in C, and I'm currently reading Ian Piumarta's sources[1]; I find them superb in presentation and design.
The error message says: Undefined symbols for architecture x86_64: "_main" (etc.) If someone truly knows nothing about C, how does that person go from that error message to "In the case for C, the entry point is defined by the “main()” function." I'm not clear on where the beginner is going to go to "dig deeper and understand what it is that it’s trying to say."
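For reference, the bare situation that produces that message is a perfectly valid C file that just doesn't define main(), so the linker has no entry point to hand over (file name hypothetical):

    /* greet.c -- compiles fine; linking it as an executable does not */
    void greet(void) { }

    /* cc -c greet.c   -> fine: an object file needs no entry point
     * cc greet.c      -> "Undefined symbols for architecture x86_64: _main"
     */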
As for going further while staying with the bare-bones approach, I suppose you'd have to start looking at assembly output and how it fits with what the C code does. I don't know anything about OS X x86_64 calling conventions etc., but at least under Linux (and AFAIK Windows), 64-bit is a lot more friendly and fun than the mess that was 32-bit (and 16-bit) x86.
There are a couple of great (free) resources on 32bit x86 assembly I'm aware of:
There are apparently some plans to upgrade HLA to x86_64 -- I don't know of any good tutorials or guides on 64-bit assembly specifically, I'm afraid.
Just adding "-S" and looking at the output can be helpful of course, although I much prefer nasm/intel syntax; for clang/gcc that should be something like "cc -S -masm=intel hello.c" (support for that flag varies by version; older clang needed different switches).
Note that gas syntax is the "default" in the GNU world, so it might be easier to just go with that if you're starting out.
It looks like clang might be generating less "noise" for tiny trivial programs; here's a side-by-side diff (in intel syntax) of "int main() {}" vs "int main() { return 0; }" (slightly reformatted):
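(Since the exact -O0 listing varies by compiler and version, just the gist: the two differ by at most a stray stack store of the 0, and with any optimization enabled both collapse to the same two instructions:)

    main:
            xor     eax, eax    ; eax = 0, main's return value
            ret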
This is stupid. You should learn C properly (know how to avoid undefined behavior) or not at all. The exception is if you're doing something that can tolerate remote code execution.
Actually, the quoted message complains about the lack of _main, not main, so the rest of this post is a huge over-interpretation. _main is, IMO, some Windows-ism connected to the use of WinMain for GUI apps.
A C program starts at the function "main", what you're discussing here is beyond the scope of the C language. C without the libc or any other form of runtime is not standard C.
I'm not trying to nitpick, but I'm worried newcomers to the language might be misled by your comments; the things you're talking about are not a concern for most coders unless they're doing low-level embedded code, bootloaders, and the like. And then your entry point won't be "start" anyway; it'll be the reset vector, or some lower stage jumping to a specific address, for instance.
Sure, I'm only trying to criticize the idea of learning a language from error messages triggered by input that's invalid per the standard, since they take you straight to implementation details and other dirty stuff.