35 comments

18

Boring code is maintainable code. I once heard a statement along the lines of: "If the highest level of human intelligence were used to write a piece of code, how could a human ever fix it when it breaks?"

4

That's actually a really good point. Properly formatting, documenting, and testing code is probably a lot more important than a clever solution.

14

I think that, considering those guys are generally writing code that must run for years without human intervention and without segfaulting, the rules are quite relaxed. You have to consider that these programs run on computers with fairly tight CPU and memory limits. That explains why dynamic memory allocation is forbidden after program startup. For a typical desktop application that would be way too strict, of course. The rule to use the smallest possible data type is likewise an artifact of the limited memory resources.

I like the intention of having an upper bound for loops. Again, remember that it's impossible for a human to just restart the device after launch. If the thing hangs in an infinite loop of any kind, it'll kill the entire mission and burn millions of dollars with it.

And the other points are 'just' artifacts of "God damnit, write readable code". IMO the document is perfectly reasonable for what they do.

1

Yeah, it certainly makes sense under the circumstances. A lot of these rules seem restrictive at first, but after thinking about them for a while they begin to make perfect sense.

3

There are certainly some of these I wouldn't follow 100% when writing end-user desktop software, because they seem too restrictive. For example, keeping functions to less than 60 lines and having at least 2 assertions per function seems unnecessary, and doesn't necessarily mean the code is safer. Outright banning recursion would also make a lot of data structure algorithms that much more confusing.

But I can see why, when the code is needed to ensure the safety of astronauts under extremely dangerous conditions, one would want such strict rules. And certainly a lot of these rules can cut down on erroneous behavior. For example, dynamically allocated memory and abuse of pointers are often a cause of memory leaks leading to undefined behavior; by banning them, these rules make that less likely to be an issue. Further, good assertions can raise an error when such a problem occurs instead of letting it propagate silently. It also seems most of these rules are there to prevent strange behavior which can happen in C but is already restricted in other languages (e.g., Java doesn't even have pointers).

7

I actually think the 60 lines per function is a very good rule.

A function with that many lines will be much harder to read and understand than a regular-sized function. It probably also contains repeated code.

Most of the time, this kind of function can be split into several smaller functions that will be much easier to test, read, and understand. Doing so might also reveal repeated code that can be factored out into a single function with parameters.

There are exceptions to everything, of course, but I learned to code under a strict limit of 25 lines per function and I believe it improved the quality of my code dramatically.

1

Interesting. Good point that one should break a function into subfunctions for readability. Reminds me of the principle that every function/variable/class should be responsible for one thing and one thing only.

1

My general rule of thumb is: a class, or a file containing a related set of code, must fit on the screen without needing to scroll.

It's kind of arbitrary, because a person with a 4K monitor is going to see more code than my 1080p screen shows. But the general reasoning behind it is that a function should do one job. If it takes up more than a few lines, it is probably doing more than one job and should be broken up into multiple functions.

Likewise with classes. If a class can't fit on one screen without scrolling it probably is really more than one class and should be broken up.

Sometimes there are good reasons to ignore this rule and in such cases the rule should be ignored. It's more of a smart guideline. If your code takes up more than one page there should be alarms going off in your head telling you there likely is something wrong.

1

I'm having trouble envisioning this working with classes. If a class has 10 properties and a constructor you're already filling the entire screen without having declared any methods.

0

Like I said, if there is a good reason to break the rule, it's fine to do so. It's more a way of staying conscious of good practices.

Your comment got me questioning myself, to see whether I actually follow this and how strictly. So I did an analysis on a library I'm writing. The average number of lines of code per file in my project is 55. This excludes "wasted lines" like comments, and "documentation" lines, which can be collapsed by my editor.

At my resolution, I can see 51 lines of code at any time. So I'm breaking the rule by 4 lines on average, but I think that's within an acceptable margin of error.

Then I did an analysis on the code base that uses the library. The average number of lines was 46. So I would have to say the technique works in practice, for the most part.

Another thing to realize is that I'm using D, a language with a lot of syntactic sugar, so that helps. C is a bit more verbose and tends to require more lines to do the same job. For instance, D's garbage collector pretty much makes destructors unnecessary.

Also, I prefer putting my opening brace on the same line as the declaration, for instance:

void foo(){

as opposed to:

void foo()
{

so that also helps.

3

Less complexity = less bugs = less dead people in space

3

For C, which these rules were almost certainly specifically for, these make a lot of sense in eliminating its most common pitfalls.

However, despite C's elegance and simplicity, I can't help but get the feeling that these rules are just a band-aid, and that C should not be involved when memory safety is important. Go, Rust, hell, even Racket are probably a better fit for such cases.

5

Any language which uses garbage collection is automatically out when it comes to mission critical systems. For instance, life support.

Why? Because part of the cleanup process is to briefly halt all threads while the collector does its job. This is fine for most applications, like a word processor. But you don't want arbitrary garbage-collection pauses on your space station. Do that shit in a predictable way, like C does.

1

I usually do everything it says here. I agree with everything except this:

All loops must have a fixed upper-bound. It must be trivially possible for a checking tool to prove statically that a preset upper-bound on the number of iterations of a loop cannot be exceeded. If the loop-bound cannot be proven statically, the rule is considered violated.

I would say a while loop is a perfectly valid loop which can potentially have no upper bound. For example: while the user doesn't press exit, keep doing stuff. Maybe I'm being pedantic here and the rule only refers to for loops.

1

I certainly agree with you, when it comes to while loops. Most notably, the main loop for a given application may never exit if the user never exits the program, as you mentioned. It would be silly for your application to just spontaneously exit so that you can enforce an upper bound.

Makes sense when it comes to calculations, though. Especially in functional programming, every function should either be able to determine the return value or realize it's caught in an infinite loop and exit.

1

They need to prove correctness of their code (checking tools, static analysis, yada yada), and I guess there are limitations regarding infinite loops and correctness.

1

This is not used in all of NASA's software:

"These ten rules are being used experimentally at JPL in the writing of mission critical software, with encouraging results."

0

Thanks for pointing that out. I should have read and linked the source PDF rather than this article. Oh well.

1

This makes a lot of sense when you know it is only used in mission critical software.

1

As a programmer, most of these make sense in normal use. They mostly encourage the creation of readable (fixable) code that is easy for anyone to look at and maintain over the years the code is used. None of these rules are unreasonable for code that people's lives rely on.

1

No direct or indirect recursion? Not even compile-time? Just shoot me. And fixed upper-bound requirement for loops, restriction on dynamic memory allocation and no function pointers... well that certainly reduces the complexity of the programs you'll be able to write. I suppose that's the intention.

More critically, I see sensible requirements for checking the validity of parameters and return values of non-void functions. No mention of exception handling apart from assertions, though. I would definitely like to see sections on load balancing, threading, resource management, feedback loop control, and recovery from critical and non-critical failures.

1

Maybe you should send this to Elon Musk and the boys at SpaceX? Good code might be boring, but...

1

I guess I'm out of the loop. Did something bad happen at SpaceX due to poor coding?

1

Their last mission to the ISS blew to hell. I'm assuming that it wasn't due to good code. :)

1

It always depends on the language and technologies used. Code clarity and maintainability are key components of most software. Writing C code is very different from writing Scala code; static typing helps a lot in writing better code, and the rules are quite different between the two languages.

I honestly would not use C to code a life-or-death system, ever! In fact, the most used programming language in fields where failure could lead to tragedy is Ada. The list of Ada applications in fields like aviation, air traffic management, power stations, the medical industry, etc. is very long. Coding guidelines are important, but it's more important to pick the right technology for the job.

1

I definitely agree; I would not want to use C for critical situations. So many times I've had strange behavior happen with absolutely no warnings or crashes. C just makes it too easy to shoot yourself in the foot and not even know it. But on the other hand, NASA needs low-level access to hardware (to interface with payloads, etc.) and a lightweight language that's available on a lot of architectures. That really limits the languages they can use. At least these guidelines make writing C safer.

1

For sure NASA has hardware constraints to deal with. I can't agree with people quoting NASA's code guidelines, or CERN's, when they are implementing a REST web service delivering the current weather. Every technology and programming language has its own guidelines. In the end, the project context is what matters, and in this case NASA uses C.

I had to write a geo-localization software for a satellite during my bachelor; it was the starting point of the one actually in use right now. I know how hard it can be having such big constraints... :)

0

I had to write a geo-localization software for a satellite during my bachelor

That's pretty awesome.

1

I'm going to advocate for indirect recursion. Stuff like map or fold guarantees that you won't get infinite loops.

Direct recursion however can get ugly and I also think it's best avoided.

0

For enterprise code this would be horrible, because it forces the code to be very hands-on and close to the metal. It makes complete sense for NASA, though. To reference Jurassic Park: "Yeah, but, John, if The Pirates of the Caribbean breaks down, the pirates don't eat the tourists." If your code underflows, overflows, or generally bugs out, it doesn't ignite 200 tons of multimillion-dollar liquid death on the users.