What is the fast track to developing for Linux?

15    16 Feb 2016 19:11 by u/roznak

I am an experienced developer (including C++), but never developed for Linux.

I don't want to waste time and energy reading tons of documents and forum discussions.

So what is the fast track to start developing in Linux (preferably Mint right now)?

  • Project folder structure (where do the source code, the compiled code, and the libraries go)?
  • Testing, debugging, unit testing, different Linux variants?
  • Deployment (installing/uninstalling)? Different Linux variants?

The end goal is a script-driven build structure that automates even complex projects into an easy build. So I want to do it right from the first line of code.

Linux Mint apps (Cinnamon) are my first priority because I really like it. My second priority will be device drivers, but first I need a grasp of normal application development.

EDIT: Thanks for the info people. :-)

13 comments

3

I'm not a C++ developer, but your question made me curious about project directory structure. I just looked at trending C++ projects on GitHub and thought the RethinkDB project seemed like it might be a good starting point. Hope that helps.

Another project that might be instructive, though it is laid out completely differently, is the Vulkan samples project.

1

That is exactly what I need. :-)

3

If you want a grasp of how Linux and the rest of the Linux ecosystem work, install Gentoo. I know it's a meme, but you'll learn a lot from installing it. You can do it in a VM too, so there's no need to install it on real hardware and risk wiping parts of your storage.

2

So, what language do you want to use? What tools or IDEs are you used to?

P.S.: Cinnamon is GTK+ based, so you could start from there.

1

I think for Linux, it should be C++.

But I am more interested in the folder structure of C++ projects. I can create a hello world program, but what would the source folder structure of a "professional" Linux application project look like, one that scales well? The folder structure is completely different from a Windows structure.

2

The first chapters of 21st Century C describe the workflow in a POSIX environment quite well.

2

First off, DON'T target your program at a specific distribution. That very significantly limits your audience and the utility of your application, especially if you go out of your way to use distribution-specific features. That said, it isn't terribly easy to do this by accident, so you probably shouldn't worry too much about it. There are SOME things you're going to have to make choices about - probably the biggest being Gnome vs. KDE vs. generic - but that's not nearly as dire, especially since most distributions ship the files for both for compatibility (and they also get the bitching from the fanatics of both for doing so).

Second, I am going to warn you that my knowledge is very dated, and I focused almost entirely on C. As someone pointed out, C++ is something of an 'odd man out' for a lot of things on Linux, although some very noteworthy subsets of the Linux ecosystem use it extensively, particularly anything relying on KDE and/or Qt.

In terms of the file structure for a project, that's going to depend on your development methodology, preferences, and the tools you use. There is no one, single way of doing it. There are projects that compile multiple libraries and binaries all out of a single directory; there are projects that use a vastly complicated array of directories for source code organization, compilation, supporting files, and documentation; and there is everything in between. You're best off choosing whatever is most appropriate for your situation. However, I have noticed a tendency toward a bin subdirectory for compiled executables, a lib subdirectory for compiled libraries that are part of your project, an hdrs or include subdirectory for header files, and a src subdirectory for the actual source code. Each of these may have its own subdirectories, particularly the src and header subdirectories.
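
For a concrete picture, something like the layout below is one common (but by no means mandatory) arrangement. The directory names are just the conventional ones I mentioned, plus a tests and doc directory that many projects add; treat it as a sketch, not a standard:

    myproject/
        Makefile        top-level build script for make
        src/            the actual .cpp source files
        include/        header files (some projects call this hdrs)
        lib/            compiled libraries that are part of the project
        bin/            compiled executables
        tests/          unit tests
        doc/            documentation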

In terms of tools, there are a ton of them. The most universal is 'make', and learning to write makefiles is a good thing. Someone mentioned autotools/autoconf, which can be very important for determining the exact specifications of your target system, especially if you are planning on distributing code that the user is expected to be able to compile. In fact, make is so central that a lot of projects use a text editor (most often vi or emacs), the shell, and make, and nothing more. Makefiles are sometimes even used for installation and uninstallation of binary packages without any source involved. You will probably want to use a version control system; git seems to be all the rage these days. I don't know what the status of CVS is, although I suspect it's still serviceable, if not the most spiffy and up-to-date.
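
As a very rough sketch only (not any real project's build - the file names, flags, and "myapp" target are assumptions based on the layout above, and GNU make plus g++ are assumed):

    # Minimal hand-written Makefile for the layout sketched above.
    # Note: recipe lines must be indented with a real tab character.
    CXX      ?= g++
    CXXFLAGS ?= -Wall -Wextra -O2 -Iinclude
    PREFIX   ?= /usr/local

    SRCS := $(wildcard src/*.cpp)
    OBJS := $(SRCS:.cpp=.o)

    bin/myapp: $(OBJS)
    	mkdir -p bin
    	$(CXX) $(CXXFLAGS) -o $@ $(OBJS)

    src/%.o: src/%.cpp
    	$(CXX) $(CXXFLAGS) -c $< -o $@

    install: bin/myapp
    	install -D bin/myapp $(DESTDIR)$(PREFIX)/bin/myapp

    uninstall:
    	rm -f $(DESTDIR)$(PREFIX)/bin/myapp

    clean:
    	rm -f $(OBJS) bin/myapp

    .PHONY: install uninstall clean

With something like that in place, make builds the binary, sudo make install drops it under /usr/local by default, and make uninstall removes it again - which ties into the installation and uninstallation points further down.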

However, more advanced IDEs are also available, including a few that also exist on Windows, such as Netbeans and Eclipse (both of which can handle C++ even though they are Java-oriented). A good list of some of the most popular is here: http://codecondo.com/10-best-ides-linux/ A more complete list is here: http://linuxmafia.com/faq/Devtools/ides.html And there's plenty more where that came from.

Libraries to use - there are a ton of them. The most important is probably the windowing toolkit, assuming you use one. Your choices are myriad, and you're going to have to do some research. GTK and Qt are the most common these days, but there are plenty more, and you really should examine what's available. I would NOT base code on the basic X11 libraries, nor on Xt. These are the raw API for interfacing with X11 and a sample toolkit, respectively. The raw X11 API is difficult to use for most situations and usually best left to those who write windowing libraries. Xt was, I believe, a sample implementation of how to write a windowing toolkit that kind of took off on its own, much to the chagrin of X11 users from the late 1980s through the mid-1990s, and is probably best buried. Motif is also a name you may hear, and it is best not to bother with that either; it was based on Xt, and although it was THE standard for high-end X11 across the industry in the 1990s, it has basically faded to next to nothing these days with the rise of alternatives such as GTK/Gnome and KDE/Qt.
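
Since Cinnamon is GTK-based, a tiny gtkmm program (the C++ bindings for GTK) might be a sensible first target. This is only a sketch and assumes gtkmm 3 with its dev packages installed; "org.example.hello" is a made-up application id:

    // hello.cpp - minimal gtkmm 3 window (sketch, not production code)
    // Build, roughly: g++ hello.cpp -o hello `pkg-config --cflags --libs gtkmm-3.0`
    #include <gtkmm.h>

    int main(int argc, char* argv[]) {
        auto app = Gtk::Application::create(argc, argv, "org.example.hello");
        Gtk::Window window;
        window.set_title("Hello, Mint");
        window.set_default_size(300, 200);
        return app->run(window);   // runs the main loop until the window is closed
    }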

In terms of how to distribute it, the standard way to cover all bases is to aim for a .deb (probably built on Debian or Ubuntu), an .rpm (probably built on Red Hat, whose non-commercial version is known as Fedora), and a tarball (for everyone else). If you plan on distributing source only, a tarball is usually what you'll be using anyway. As for where the files go, they usually go in some fairly standard directories. The specific standard is here: https://wiki.linuxfoundation.org/en/FHS For a brief overview, though: /bin is for essential programs, /etc is for configuration files (you should probably not use this unless you're building something important for the system itself), /lib is for important system-level libraries, and /sbin is for administrative tools that are extremely critical. /usr is where most of the regular programs go; most of your binaries will go in /usr/bin, and /usr/lib is where most libraries go. /usr/local is where people most often dump binary tarballs or custom-compiled applications. /usr/X11* are a series of directories you likely have little need to go into, so I won't cover those, other than to say they tend to be low-level X11 stuff.

As someone else mentioned, if you are distributing source, consider using automake to generate makefiles on the fly. I would very strongly encourage you to pick up several distributions and run them in virtual machines to test installations on all of them. As a decent mix to test with, particularly for a "serious," commercially-inclined application, get Debian, Ubuntu, SuSE, Red Hat, Mint, Arch, and Slackware (for the lowest common denominator), plus whatever else is high on the distrowatch.com charts. At a bare minimum, test compilation against Slackware. Please don't jump on me, fans of other distros; I just want to give something to start with. You may also consider some up-and-coming distros such as Elementary and Zorin. Note that most of this does not apply in the same way to Android; it uses the Linux kernel, but there are numerous differences in userland, which is where you'll be spending almost all of your time (unless you want to get into systems programming, in which case you're not going to be using C++, but straight C - no realistic negotiation there, end of story).

As for uninstalling, the major package managers uninstall things for you, so don't worry about that. Uninstalling without a package manager usually consists of deleting whatever you put in, possibly leaving user configuration files behind.
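
For instance, with a hypothetical package (the names and version here are made up):

    $ sudo dpkg -i myapp_1.0_amd64.deb     # Debian/Ubuntu/Mint: install a local .deb
    $ sudo dpkg -r myapp                   # ...and remove it again
    $ sudo rpm -i myapp-1.0.x86_64.rpm     # Red Hat/Fedora/SuSE: install a local .rpm
    $ sudo rpm -e myapp                    # ...and remove it again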

Other languages, such as scripting languages, are also possible, but they have their own ecosystems with their own norms and the like. These include Perl, Python, Ruby, TCL/TK, and a variety of others. Note that many of these tend to be somewhat trendy and come and go, so I would advise you to consider this when choosing your language, should you opt for scripting. I may be dating myself by even mentioning TCL/TK, in fact. Other languages, such as Java, are also readily available. I believe FORTRAN and Pascal are available as well. NASM is available for Linux, but you'll have to look up the information on using it, and I don't advise it unless you want to write for the kernel or extremely low-level libraries (which would be an extremely bad idea at your level of experience).

As a note, depending on your goals, you will likely want to learn more about Linux architecture and Unix-specific programming conventions, such as signals and raw sockets. I do not have the name of a good book off the top of my head. Note, however, that this is fairly advanced and you may not need it right away, so while it is very helpful to know this stuff, do not overload yourself trying to learn how to write signal handlers, especially since, depending on the libraries you use, you may never even be able to touch them directly anyway. Learning a bit about X11 architecture is also good, although the specifics of writing code directly against X11 are probably not going to be too helpful, as it is a very rare situation where you would want to do something that couldn't be handled by a mainstream windowing toolkit. X11 may be on its way out anyway.
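
Just so the term isn't mysterious, here is roughly what a signal handler looks like - a minimal sketch using the POSIX sigaction call, purely illustrative:

    // sigint_demo.cpp - catch Ctrl-C (SIGINT) and exit cleanly (sketch only)
    #include <signal.h>
    #include <unistd.h>
    #include <cstdio>

    static volatile sig_atomic_t got_sigint = 0;

    static void handle_sigint(int) {
        got_sigint = 1;        // only async-signal-safe work belongs in a handler
    }

    int main() {
        struct sigaction sa {};
        sa.sa_handler = handle_sigint;
        sigemptyset(&sa.sa_mask);
        sigaction(SIGINT, &sa, nullptr);

        while (!got_sigint) {
            pause();           // sleep until a signal arrives
        }
        std::printf("caught SIGINT, shutting down\n");
        return 0;
    }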

As for testing, I'm afraid I can't help you there, but debugging has one king on Linux - gdb. Learn it. A lot of IDEs, if I recall correctly, have debuggers that are actually front-ends for it. HOWEVER, gdb is not the only option. There are a number of debuggers, some of which are much more visually oriented. Some of these are here: http://www.drdobbs.com/testing/13-linux-debuggers-for-c-reviewed/240156817 and here: http://toppersworld.com/top-10-visual-linux-debuggers-for-c/ but I cannot really vouch for any of them.
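
The bare-bones gdb workflow looks something like this (the program name, source file, and variable are placeholders):

    $ g++ -g -O0 -o myapp src/main.cpp   # compile with debug symbols, no optimization
    $ gdb ./myapp
    (gdb) break main          # set a breakpoint at main()
    (gdb) run                 # start the program
    (gdb) next                # step over one source line
    (gdb) print some_var      # inspect a variable
    (gdb) backtrace           # show the call stack
    (gdb) quit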

Also, one last thing - if you have specific licensing needs, be sure to research the EXACT terms of the licenses of the GNU C++ standard libraries. I think there may be some licensing oddities there, but I am not sure, and if there were any before, I'm not sure there still are. In fact, be aware of the legal nuances of any libraries you do use. Some are under free software licenses that will require you to release your code; while most of the standard libraries are not really impacted by this, some could well be, particularly niche libraries. You are unlikely to need a lawyer for anything non-commercial, but if you are in doubt, ask.

DISCLAIMER: Once again, my opinions only, much of this info is old, YMMV, I didn't do a lot of research for this. I am 100% sure I forgot things that people are big fans of; to them, my apologies, but this should show just how wide the Linux world is, and how many possibilities there are. Good luck.

1

One additional note that Voat didn't like me putting in (post exceeded 10KB):

In terms of installation and libraries, BE PREPARED FOR HEADACHES. Linux is something of a minefield for installation, and in particular, library woes can sometimes be extremely difficult to solve, to the point where the user simply gives up. Try to avoid that fate for your users when choosing libraries, and be very clear about which ones your application needs and which versions to prefer. This isn't meant to scare you, and you'll probably be fine, but this sort of thing can and does happen, so watch out for it and be sure to test both running your binaries and compiling on other distros. IMO, Linux software installation is a vast, gaping hole in the armor of its push to the desktop.
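
One quick sanity check when trying a binary on another distro (assuming the binary is called myapp here):

    $ ldd ./myapp     # lists the shared libraries the binary needs;
                      # anything reported as "not found" is a missing dependency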

1

Impressive reply; I have some good starting points now.

0

This totally reminds me of an old martial arts saying.

A young person goes up to a great grandmaster of a martial art and asks, "If I were to start today, how long would it take for me to become a great fighter?" The grandmaster says, "5 years." The youth responds, "But I am smart and a quick learner. How long now?" "10 years," says the grandmaster. The youth quickly replies, "I will work hard every day, 16 hours a day." The grandmaster answers very sternly, "50 years!"

edit: formatting