u/tribblepuncher - 60 Archived Voat Posts in v/programming
0 posts · 60 comments · 60 total

Active in: v/programming (60)

Comment on: Report: 80's kids started programming at an earlier age than today's millennials

They weren't as easy to use, but they were far, far easier to program and understand. That in and of itself made a world of difference.

0 25 Jan 2018 15:29 u/tribblepuncher in v/programming
Comment on: Some day we won't even need coders anymore

"Programmers will definitely be out of business within the next hundred years, but there will be a new need for something different that evolves from a computer programming itself. Though, it will be more niche."

I think that there may be a need for programmers, but fewer of them, probably highly and specially trained, and involved deep in the system. The fact that a computer can write most of the code does not mean that there will be no place for humans in the equation. Then again, they might make us entirely obsolete. This is a hard one to predict.

0 07 May 2017 05:36 u/tribblepuncher in v/programming
Comment on: Apparently, they don't want to hire people that knows about HTTPS.

Admittedly, that would at least be very unexpected.

1 01 May 2017 02:03 u/tribblepuncher in v/programming
Comment on: AGILE/SCRUM is litteraly the homeopathy of the developers community

"Pair programming" seems like a most horrific practice. I've never done it, but I've read about it, and every time it makes me cringe. Frankly, I think I'd end up slowly going crazy, whether I was the smart one or the dumb one in the pair.

3 23 Feb 2017 01:50 u/tribblepuncher in v/programming
Comment on: I just spent 12h in Delphi trying to fix fatal bug. At the end I had to check almost line per line until I removed a '\' character from a string.

"PHP auto-creates the variable when you use it the first time, so I got no visible errors, but I also got wrong output."

This is one reason why I try to always declare my variables explicitly at the beginning of a function/subroutine, regardless of the language. Truth be told, I dislike even an explicit declaration when it sits in the middle of the code instead of at the start of the function.
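The same class of silent-typo bug exists outside PHP; here is a minimal Python sketch (the function and values are hypothetical) showing how a misspelled variable name quietly creates a new variable instead of raising an error, so you get wrong output with no visible failure:

```python
def average(values):
    total = 0
    for v in values:
        totl = total + v  # typo ("totl"): binds a brand-new variable, no error raised
    return total / len(values)  # total was never updated, so this is silently wrong

print(average([2, 4, 6]))  # prints 0.0 instead of 4.0
```

Declaring (or at least initializing) every variable up front narrows the window for this: the typo then shows up as an obviously unused name rather than as wrong arithmetic.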

2 20 Sep 2016 21:01 u/tribblepuncher in v/programming
Comment on: I just spent 12h in Delphi trying to fix fatal bug. At the end I had to check almost line per line until I removed a '\' character from a string.

Can confirm, when I was much less experienced I spent two weeks looking for a problem that ended up being an extremely tiny typo. I think this is a rite of passage for programmers.

One you keep repeating, no less.

1 20 Sep 2016 20:59 u/tribblepuncher in v/programming
Comment on: Teach Yourself Programming in Ten Years

After a certain point, you can learn the basics of a language in a weekend. In fact, after a certain point, you are often expected to in school.

That said, you won't necessarily be terribly great in it, and you may not be able to do much useful because of a lack of knowledge of the relevant APIs.

In terms of what languages are best at/worst at, that's largely a matter of interpretation. A few samplings:

C - high-speed, moderately to highly portable code that gets close to the machine. However, optimization can be difficult on some architectures (you're thinking like a human, but the CPU is designed to do things better under other assumptions, e.g. out-of-order execution). Additionally, it gives you more than enough rope to shoot yourself in the foot, and in a lot of cases the specific libraries you're working with are unique to the platform and not generalizable to standard C.

C++ - object oriented version of C, 99% backwards compatible, to the extent that you've probably never actually used a true, pure C compiler except maybe gcc. It has a lot of the same advantages in terms of speed but it does have noteworthy increases in bloat, and the OO paradigm is not the best. It is also criticized for having too many ways to do things, which can cause problems in collaboration.

Java - Decent (not spectacular) speed, highly portable in both binary and source form. Moderately universal these days; it's hard to go wrong with learning it. However, the language constructs leave something to be desired, Oracle has its hooks in it and it's often a memory hog. Plus, Swing is sluggish and in some cases it suffers too much because it thinks it's smarter than you, which it usually is, but sometimes it isn't and when it isn't it bites you in the ass hard.

C# - similar to Java in many ways, but less popular and much more Microsoft-centric.

Lisp - Parsing, to at least some extent, and artificial intelligence applications, particularly genetic programming. However, it is very counter-intuitive to human thought, and most practical implementations have to be interpreted. Uses the functional language paradigm, which is extremely non-traditional (your code is essentially forming a series of parse trees; it's half tree-building and half writing equations, almost).
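The "your code is essentially forming parse trees" point can be made concrete: a Lisp expression like (+ 1 (* 2 3)) literally is a tree, and a toy evaluator (a Python sketch, not any real Lisp implementation) can walk it in a few lines by representing each expression as a nested list:

```python
import operator

# Operator symbols mapped to their two-argument functions
OPS = {'+': operator.add, '-': operator.sub, '*': operator.mul}

def eval_sexpr(expr):
    """Evaluate an s-expression given as nested lists, e.g. ['+', 1, ['*', 2, 3]]."""
    if not isinstance(expr, list):
        return expr  # a leaf of the tree: just a number
    op, *args = expr
    # Recursively evaluate the subtrees, then apply the operator at this node
    return OPS[op](*(eval_sexpr(a) for a in args))

print(eval_sexpr(['+', 1, ['*', 2, 3]]))  # (+ 1 (* 2 3)) → 7
```

Evaluation is just a post-order tree walk, which is why writing Lisp feels like half tree-building and half equation-writing.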

Perl - Good for regular expressions and pattern analysis. Was an early favorite for web scripting applications and is still quite capable to this day. But the language can do things in so many ways that it's inevitably a mess unless you're using the options of "warnings" and "strict."

JavaScript - about as universal as a language gets these days; it's deep in every browser. However, its structure is lacking (though nowhere near as bad as Perl), it is interpreted in almost all implementations, and the fact that it's browser-centric does limit it.

Assembler - In theory, the fastest, and also the only way to do certain things, particularly when you're writing firmware or BIOS-type code. In practice, you can shoot yourself in the foot a lot if you don't know how the processor uses individual instructions, e.g. out-of-order operations in order to properly fill the instruction pipelines. It is also non-portable off of most architectures, and is extremely difficult to read.

BASIC - Easy to read and a classic over the years. It used to be so ubiquitous that just about every PC had it in ROM. Forms the core of Visual Basic from Microsoft. However, it has a lot of different dialects and ends up over-simplifying quite a few things at times. It also has limits in terms of how "close" it can get to the machine, and often has performance limitations.

COBOL - In theory this is good for executives who don't know how to read and write traditional code. In practice, madness and death await you (which is why there are so few programmers for it around, aside from the existing ones getting old).

FORTRAN - I know very little about it, but supposedly it's excellent for scientific computing and certain high-speed calculations. It has not been very popular for some time, though it remains popular enough to retain interest.

Prolog - Works by setting up a logic system instead of a traditional program. However, it's a language you can redefine "1" in, so I wouldn't advise it except as a learning platform for advanced programming language theory.

All this is off the top of my head and I haven't touched some of these in over a decade. Python is one I didn't cover since my experience with it is nearly non-existent, and I also skipped a few other highly popular ones I have no experience with, including PHP, Ruby, and Objective-C. YMMV.

As a note, if you are looking for future projects to tackle, Java is probably the most immediate use for getting a job and closest to what you're used to, C is of great practical use in terms of learning how the machine actually works, Basic (specifically Visual Basic) can get something GUI-centric working with minimal experience, and Python is popular enough that it's probably worth a look as well.

3 13 Sep 2016 14:30 u/tribblepuncher in v/programming
Comment on: What programming language is good for a beginner?

Alright, that is very important to note, and I suggest you put it up in the post itself, because I think you're giving the wrong impression by noting the TI-84. You shouldn't take that out, but be sure to mention you DO have a computer.

Assuming your computer is reasonably modern, your choices increase dramatically. So you don't want to do Python. Is there anything in particular you do want to do? If the answer is "I don't know," that's alright, but it helps to guide the answer.

Though a word of advice - if you're serious about programming, don't get TOO attached to any one particular language. There are many, many languages out there, and each has some things it is good at, and some things it is not good at (and in some cases is outright bad at). For now, it's important to learn how to put the logic together. Don't expect to do the fancy stuff right away. While there are some more specialized approaches you can take (such as using RPGMaker for games), those usually have limitations of one sort or another, and although you can reapply generalized concepts, it's not as straightforward as most would like and often requires learning entirely new syntax.

0 03 Sep 2016 17:34 u/tribblepuncher in v/programming
Comment on: What programming language is good for a beginner?

That's a nice big dose of "it depends."

In my opinion? C/C++ and Java. Java because it's everywhere, and C/C++ because it's everywhere and it gets close to the actual machine. There is a reason most kernels are written in C/C++.

There's also COBOL, since a lot of the original programmers of systems written in COBOL are, well, dying.

0 03 Sep 2016 16:28 u/tribblepuncher in v/programming
Comment on: What programming language is good for a beginner?

Lisp is a great way to frustrate OP into dropping programming entirely.

1 03 Sep 2016 16:24 u/tribblepuncher in v/programming
Comment on: What programming language is good for a beginner?

I do.

They're called "any Android app."

Java in general gets a lot of crap for things like, well, Swing, which is responsible for a good chunk of the lag in Java apps run on the desktop. It is noteworthy that Swing is not even available in the standard Android SDK (and I think it's pretty hard to find it for Android period).

1 03 Sep 2016 16:22 u/tribblepuncher in v/programming
Comment on: What programming language is good for a beginner?

People have hated Java since the day it was conceived and have been predicting its downfall.

It's now somewhere around 20 years old and is in darn near anything, including being the center of a multi-billion dollar lawsuit only a few months ago.

Java has problems, yes, but it also has tremendous acceptance. Whether or not it's the best solution is another story and open to interpretation, but don't follow the conga-line of people who generally like promoting their pet language or the latest fad language and scream that Java and C are the incarnation of evil in programming languages. The story is far more complicated and nuanced than that.

1 03 Sep 2016 16:20 u/tribblepuncher in v/programming
Comment on: What programming language is good for a beginner?

IMO, unless TI's languages have changed dramatically, then TI-BASIC of almost any stripe is a fairly poor way to learn programming, but it is acceptable if you haven't got a real PC to work with. As such, I suggest you start there, but realize that you're going to need to move beyond TI-84 at some point. You will probably rather rapidly outgrow it. With the TI-84, AFAIK your only choices are TI-BASIC and assembly, and I'm pretty sure you need to use external tools for assembly (and assembly is not something you want to cut your teeth on unless you have no choice).

However, one thing to note is that you are NOT going to be able to reuse TI-BASIC skills on any other platform. TI-BASIC is peculiar to TI's offerings only, and it is appropriate for a calculator - that is to say, not all that sophisticated. You'd be doing this for a) personal curiosity, and b) developing overall concepts, e.g. what an IF-THEN-ELSE structure is for. There will be a learning curve when you transition to something more sophisticated.

Once you get your hands on something better, I would consider C, JavaScript, or Java (distinct from JavaScript). JavaScript is probably the most forgiving of the three and is part of any major web browser, Java runs anywhere, and C lets you see how the computer really works in all its glory (if you want to call it glory, some would call it horror).

If you want further recommendations, I suggest that you give us an idea of what you do have access to. Is the TI-84 literally the only computing device you have? Do you have an Android smartphone? Do you have a PC of almost any stripe (and if so, can you give general specs)? Telling us what other resources you have - or that you have none - would be helpful. Simply saying "I have a TI-84" does not indicate whether that's all you've got, end of story.

0 03 Sep 2016 16:12 u/tribblepuncher in v/programming
Comment on: Stackoverflow needs to be circumvented. rant + ramble

I don't think the "duplicate" complaint about Stack Overflow is (necessarily) as concerning as the "irrelevant" problem. Hear me out, because I'm not saying the duplicate complaint is baseless by any means, and this is only one person's experience.

However, I have found that whenever I have a programming question - and many times when I have a theoretical computing question - an answer that is at least somewhat useful turns up from Stack Overflow on the first page of Google results.

And pushing 80% of the time, that question is in some way marked as irrelevant, off-topic, or otherwise stamped "this question is low quality and/or beneath us." I get the strange, instinctive feeling that I am far from the only one who experiences this phenomenon.

One thing I have noticed on some web sites is an overwhelming zeal to improve their "quality" to the point that it strangles the site. This usually happens in more private venues, but Stack Overflow is a very prominent and public example, and it's been like this for several years. They're increasing the "quality" and narrowing the "scope" to the point that, sooner or later, every question is going to be irrelevant.

Also, as a note, lest anyone say "it's been asked before!" - just because someone asked it eight years ago does not mean it does not warrant a new explanation. It is unreasonable to ask someone to plunge through a web site's archives for 12 hours in the vain hope of finding some question that has already been asked; it always has been. It rather defeats the purpose of asking, when they could instead beat their head against official documentation and/or textbooks for 12 hours with about the same odds of "getting it." This particular problem is not limited to Stack Overflow either, but as with "improving quality unto death," it is a prominent example.

Thank you for calling them out on their bullshit. I would have considered doing so a long time ago, but this attitude has made me want to only look at the site for questions and not actually participate (and be berated for doing so).

1 03 Aug 2016 00:13 u/tribblepuncher in v/programming
Comment on: Linux marketshare doubled since five years ago. From 1% (July 2011) to 2%, July 2016.

Actually there are a number of distributions that "just work" on a lot of hardware. I'm not sure what's wrong with your installation (see my other post).

One thing I have noticed is that Linux can have problems with really cutting-edge hardware, particularly graphical hardware, because the drivers haven't been fully written yet. Depending on your video hardware, that may be the problem.

0 02 Jul 2016 12:55 u/tribblepuncher in v/programming
Comment on: Linux marketshare doubled since five years ago. From 1% (July 2011) to 2%, July 2016.

Can you provide any more details on the precise error messages, and your video hardware? Also, how are you getting the error messages if it won't show the display - are you getting a text display and command prompt, at least?

I feel that this is something that can probably be worked around, at least enough to fix the problem and get proper support. I'm thinking that passing parameters to the kernel would probably solve the problem, but I'm not sure of that.

Also, I would consider trying Knoppix, which is designed to run on a lot of computers in emergency situations, or an RPM-based distribution, probably openSUSE.

1 02 Jul 2016 12:54 u/tribblepuncher in v/programming
Comment on: Which programming language to learn first [infographic]

Was coming here to post this. I've seen this chart before. It does not impress.

"The easy way? Python. The best way? Python." No bias there, nosirree.

1 15 Jun 2016 03:42 u/tribblepuncher in v/programming
Comment on: Why can't programmers... Program?

This depends on if you mean 'useful' for the mathematical aspects, versus 'useful' for the practical ones.

For both, the ones that you named are pretty important. For the math branch, you are probably going to have to go into computer languages pretty in-depth in a lot of ways. Finite automata, for instance, is a course that goes into some of the most basic elements of computing, which are applied to both programming language processing and computing in general. Programming languages are where a lot of the theory ends up coming together for a practical purpose.

For raw usefulness in terms of putting together useful programs, I would consider networking, operating system pragmatics (not an OS-building course, those are a whole other kettle of fish), programming languages (not learning the languages, learning about languages, e.g. intro to language theory), and possibly software engineering and/or software testing. Definitely take a project management course. These days you may also consider concurrent programming, if it's available, and be sure to take at least one course on assembly language. You need to know what's actually happening under the hood; relying on JavaScript frameworks your entire career is a road to being a crappy programmer.

For the mathematics branch, language theory, algorithms courses, formal methods and proofs of correctness are all pretty important. I should note that I am not as familiar with the mathematical portion as I perhaps should be. Simulation, at least involving theory, is another one to put in this category as well.

For both "tracks" I would say finite automata is an absolute must, and AI is something that should be considered as well.

Keep in mind that I'm pretty rusty so take this advice with a grain of salt, but off the top of my head that's how I'd arrange it, roughly speaking. Also keep in mind you are likely going to have to learn things you'd rather not learn - a CS degree is generally going to be broad enough that you can do a wide variety of things (and if it isn't you would probably be best served in another program).

1 31 May 2016 05:13 u/tribblepuncher in v/programming
Comment on: Why can't programmers... Program?

That's an example of a good class. Probably learning programming language paradigms. Did you also examine basic parsing/parse trees? Some language classes end up being an overview of the entire subject of the programming language, which can be pretty interesting in and of itself. Especially since it is in many ways a follow-on to the whole "theory of computation" that you start with when using finite state machines and the like.

1 25 May 2016 12:45 u/tribblepuncher in v/programming
Comment on: Why can't programmers... Program?

A lot of the best programmers out there come into the program already knowing at least some programming. This problem doesn't require a degree of any sort, and can probably be solved by anyone who bothers to read and understand a beginner's book, e.g. Language-Of-Your-Choice For Dummies.
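For context, the article this thread riffs on is usually associated with the FizzBuzz screening question; a minimal Python solution, as one sketch of the kind of beginner-book-level problem being discussed:

```python
def fizzbuzz(n):
    """Classic screen: multiples of 3 -> Fizz, of 5 -> Buzz, of both -> FizzBuzz."""
    out = []
    for i in range(1, n + 1):
        s = ('Fizz' if i % 3 == 0 else '') + ('Buzz' if i % 5 == 0 else '')
        out.append(s or str(i))  # fall back to the number when neither applies
    return out

print(fizzbuzz(15))  # last entry is 'FizzBuzz', since 15 is divisible by 3 and 5
```

Nothing here goes beyond loops, conditionals, and the modulo operator, which is exactly the point: it filters for basic fluency, not degrees.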

2 25 May 2016 07:24 u/tribblepuncher in v/programming
Comment on: Why can't programmers... Program?

The problem is that there is a fundamental misunderstanding of what CS actually is.

In theory, CS is essentially a mathematical approach to analyzing problems and developing algorithmic solutions to those problems. Computers are incidental to the entire field. In fact, many of the hard-core "computer scientists" tucked away in certain places have little to nothing to do with a real computer. Dijkstra barely used one.

In practice, CS is seen as a "programming degree," except more hard-core than MIS, which tends to be a hybrid between business and computer pragmatics at a lower level and several different things (including behavioral analysis with technology) at the high research level.

The biggest problem is that there is no real divide in computer science designating "this is computer science, programming edition" versus "this is computer science, math edition." And considering that most places offering the math edition would dry up - most CS majors I know are in it for the technology, not the mathematical theory - they're probably not going to go out of their way to make that distinction any time soon. To the chagrin of everyone else, unfortunately.

The trend I've noticed is that the more "high-end" a computer science program is, the less you actually learn to program and the more time you spend on hard-core math classes. As a result, these guys tend to publish crap in the general direction of one another that is sometimes useful, but in a lot of ways about as practical as any other specialized math degree. Meanwhile, the ones at the low end tend to end up over-dependent on specific frameworks, while never emphasizing the more important and fundamental issues.

IMO a lot of the really "good" CS programs are in the middle of this, because you learn some theory and some programming. However, even this model has its problems. For example, a lot of the time you don't learn much about graphical programming, and it can be difficult to see the use of what you're learning if it looks like you're poking around writing programs to run on a text terminal that could run just fine on a display from 1975 and everyone else uses GUIs that are built by a tool. Of course, the one who's making the program is arguably learning a lot more since the text-only output is just a simple way of verifying that the underlying code is doing what it's supposed to, but it sure doesn't look like that. Not to people on the outside and not to a lot of people on the inside.

10 25 May 2016 07:19 u/tribblepuncher in v/programming
Comment on: Programmers how do you tackle the feedback problem?

You may want to consider putting telemetry in there - with one huge, HUGE caveat.

IT MUST BE OPT-IN ONLY.

Otherwise what is a debugging tool for you becomes spyware for them.

I don't know if this would give you the feedback you're searching for, but it might help in any case.

One additional thought - let people know their opinions matter. There's an MMO that I am on that had a very large change. I didn't comment on these changes, though, even though I had ideas, because I felt that nobody was going to listen to me. Considering how much response I got from other feedback I've sent in, I was probably right. Replying to feedback may help with this, or at least, would make it somewhat more likely that someone who gave feedback once will give additional or more detailed feedback in the future.
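The opt-in-only rule above can be sketched in a few lines; everything here (the config key, file path, and send callback) is hypothetical, but the shape is the point: no report ever leaves the machine unless the user explicitly flipped the switch, and any doubt defaults to off:

```python
import json

def telemetry_enabled(config_path):
    """Opt-in only: a missing setting, a missing file, or a bad file all mean OFF."""
    try:
        with open(config_path) as f:
            return json.load(f).get('telemetry_opt_in') is True
    except (OSError, ValueError):
        return False  # never guess "on" from an error

def maybe_send_report(report, config_path, send):
    """Call send(report) only if the user explicitly opted in; report whether we did."""
    if telemetry_enabled(config_path):
        send(report)
        return True
    return False
```

Defaulting to off on every ambiguous case is the difference between a debugging tool and spyware: the burden is on the program to prove consent, not on the user to discover and revoke it.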

1 01 May 2016 20:37 u/tribblepuncher in v/programming
Comment on: The Lie That Has Beguiled A Generation Of Developers

I'm growing tired of hearing people say how much they hate C because it doesn't directly support and/or incorporate and enforce someone's favorite <insert-bloatware-or-abstraction-of-choice-here>. C has deficiencies, but anyone who's honest with themselves and actually knows what they're talking about knows that all languages have deficiencies, often severe ones, including their own pet language and pet programming model.

1 01 May 2016 13:04 u/tribblepuncher in v/programming
Comment on: Have Software Developers Given Up? (an interesting read and so are the comments)

"The failings of programmers has little to do with math"

Agreed 100%. I hate math, and I struggled with it (possibly due to poor teaching - very long story there), but I'd like to think I'm a relatively decent programmer at least. Being able to put together logical structures has always been more valuable than pure mathematical skill. While these two are related, they are most certainly not identical.

Also:

"that almost everyone that was programming had some personal investment"

Most (not all) programmers that really know WTF they're doing started in their teens or tweens, or in some cases even earlier. I think this has something to do with brain formation. But if you don't like what you're doing, at least a little bit, you're probably not going to be able to put out much good code. There's too much logical detail required, even in languages that are supposed to do all the heavy lifting for you and where you have most of the components pre-written. If you're going to be doing anything but the most trivial of work with these, it's going to require some logical skill to put together without flaws - even with logical skill there might be flaws, since a lot of these bugs can be pretty darn subtle, especially dealing with security.

"learned how to drag and drop forms onto a UI builder is proclaiming themselves a programmer and using university degrees to back that assertion up"

Some of the university programs I'm familiar with in computer science don't even teach anything drag-and-drop in the core courses. At least a few have a lot of their work done on paper. This probably helps distinguish actual computer science majors from people who just have generic programming degrees for the language du jour.

4 28 Apr 2016 03:29 u/tribblepuncher in v/programming
Comment on: Have Software Developers Given Up? (an interesting read and so are the comments)

I blame a lot of this mentality on the thought that you can patch anything, anywhere, regardless of the innumerable problems it causes.

I remember a time when the thought of "patching" a console game was pure and utter nonsense. It wasn't just that it didn't happen, it was that, for all practical purposes, it was next to IMPOSSIBLE, so the games, for all the infamous glitches found in the last couple of decades, were usually pretty solid for most people. Now it's a regular occurrence.

Unfortunately "patching" involves all sorts of serious problems, including fucking up the system worse, and sticking "goodies" in there that the software manufacturer wants to force you to have whether you want it or not (e.g. de-rooting Android phones, spyware, etc.)

There are a lot of things contributing to this sorry state of software development, but I think that's a huge one that people are just ignoring. It may even be something of an elephant in the room, for that matter.

3 28 Apr 2016 03:23 u/tribblepuncher in v/programming
Comment on: I've reached a point in my project where I think I could get my coding skills critiqued. Where (besides here) could I post my project to get feedback?

You're assuming that it won't be immediately shut down as off-topic, since pretty much everything useful is.

0 20 Apr 2016 05:20 u/tribblepuncher in v/programming
Comment on: Somebody is injecting code into the SNES by playing super mario world

This is not all that uncommon; it's been a feature at the AGDQ and SGDQ marathons, among others. It's pretty awesome to see, though!

1 29 Mar 2016 02:05 u/tribblepuncher in v/programming
Comment on: What's is the best text editor to use from command line?

I'm gonna be the guy who's different and say that I prefer 'joe'. I got it when I first cut my teeth on Linux in the 1990s. It was a very popular text editor to include on the distributions available back then, including the micro-distros that sat on UMSDOS (think along the lines of the rough equivalent of a modern Linux Live CD).

Joe is not as powerful as vi or emacs by any means, but a lot of the time, you don't need that much. At the same time it's a world better than nano/pico/etc., in my experience.

That said, I'd love to see a text editor in the style of the old Borland text-mode DOS IDEs, available for Linux and Win32. I don't know how likely that is, though, since I'm pretty sure nobody bothers with GPM (a package for mouse support at the console) anymore, though a few packages do seem able to use the mouse in X-based terminals for operations other than copy-and-paste.

1 18 Mar 2016 02:02 u/tribblepuncher in v/programming
Comment on: As a Linux Developer, do you find the GNOME or KDE ecosystems better and why?

DISCLAIMER: I have not produced a lot of code for Linux in some time and was never very involved with either GUI system at all, so my information is outdated, and a lot of this is based on old memories since I haven't had an opportunity to seriously diddle with some of this stuff for quite some time.

However, to summarize: a pox on both their houses.

Let's start with KDE, which has had my ire since the get-go. They based their system on Qt, which basically requires application code to be C++. This is IMO a very poor way to go - actually, I think it's flat-out obnoxious. It isn't because C++ is so horrible, per se, but because it locks in the choice of language a lot more readily and impacts performance. Performance-minded programmers now have that much more crap to go through. You can't work with it directly from C, and C++-to-C bindings are a great way to have the disadvantages of both with the advantages of neither.

People who insist that "computers have enough memory/processing power/etc. to deal with <insert pet bloatware here>" are often right - right up until the 700th abstraction layer is slathered onto the PC, and like a bran muffin coated in half a pound of butter and lard, suddenly things aren't what they were supposed to be, and people are back to dealing with the bloat, whether physiological (for the muffin) or in terms of their computer's lag and memory consumption. Also, in my experience, C conventions are a lot more universal and flexible when integrating with other languages; if you're writing a program in, say, Pascal, for some reason, it helps to be able to interface with a C library as opposed to some C++ binding.

I am also not too fond of forcing an OO paradigm onto that level of code, even though I'm certain someone reading this is tearing their hair out, screaming "object oriented GUI code is the One True Path, you mouth-breathing neckbeard!" and probably calling for my head on a platter for their mantel right now. Yes, OO fits GUI code, but it can be brought to bear in C as well (not very easily, but effectively nonetheless), and the lower level you get, the more flexible you should be toward the needs of application programmers. I consider the base code of the UI to be pretty darn low level for most application software.

The use of the Qt library is also questionable at best, since it is/was a) proprietary, b) a big part of the whole C++ issue, and c) forced the use of the GPL unless you could cough up $2000 or so for a license that let you do what you want with it. I am of the opinion that the basic tools of development on any platform are generally useless if they force you into a particular license, and that right there is an example of it. As I understand it, Trolltech changed the rules so the license implications are no longer the same, but the fact that the project decided on this from the beginning does not make me particularly inclined to trust its decisions in the future.

Additionally, my experience with KDE fans is that they crow their system's superiority in a way that is loud, self-righteous, arrogant, assumes universal superiority, and is fundamentally obnoxious. I am not saying most advocates are like this, and I am not saying GNOME doesn't have them, but I've run into more KDE advocates like this, which leaves a very poor impression of the community.

GNOME, on the other hand, was good during GNOME 2, but eventually developed a standard practice of lobotomy-as-upgrade, wherein it is apparently undesirable to let you customize your screen saver, probably because it's "too hard" or some other ridiculous reason. The usual dismissal is that you can still pull off some obnoxious, obscure hack to a configuration file somewhere, but that is utterly ridiculous and reeks of Firefox's habit of burying good features alive and then killing them once the fuss has died down. GNOME's design has also shifted far more towards tablets and the like, a trend that does a great job of pissing off both people who don't want to use tablets and people who do want to use tablets but also want a desktop OS for when they have something more serious to do than play a game or browse the web on the bus.

Additionally, the GNOME tentacles have grown quite a bit further than they should have, and the ecosystem is now mingling with systemd, which is slowly forcing a large, bloated, monolithic software block on us as the "standard Linux environment" - one filled with software that a very large number of Linux die-hards seriously question. The fact that nobody seems to be making this decision, yet everyone is mindlessly following it, is alarming at best and makes me seriously question whether something is very rotten in all this. It is especially troubling since a lot of those involved seem to be widely disliked, and for good reason, which bodes ill for future community efforts.

GNOME as an organization has serious problems as well; aside from its questionable design practices, it seems more interested in using its funds to promote pet political projects than in writing the code those funds were donated for in the first place. As such, I question its ability to sustain itself, since I don't think too many people are going to open their wallets again after that little stunt.

Now, what is my solution? Unfortunately, as with so many things in software these days, there probably isn't a great one. At this point I would either use a portable toolkit that lets you compile against both environments easily, or write my codebase to thoroughly decouple the business logic from the user interface. If I had to choose one, I would pick GNOME, because it seems to give the developer more choice and is more popular, but I would not be happy about it.

5 16 Mar 2016 23:24 u/tribblepuncher in v/programming
Comment on: Insider comments on the state of affairs at Mozilla and predicts it's demise

How is IceCat? Is it so concerned with being philosophically pure that it ends up crippling itself for anything practical? I mean that seriously, not sarcastically; that sort of thing can and does come up with some free software projects (Hurd being the most glaring example).

0 07 Mar 2016 19:17 u/tribblepuncher in v/programming
Comment on: Insider comments on the state of affairs at Mozilla and predicts it's demise

This is not entirely a bad thing. More web rendering software is important, true, but there are other things that could well be equally important, such as compatibility with extensions and a lot of customization options. When I use Firefox, it's usually not because I am enamored with Gecko, it's because it does things (for now) that other web browsers won't, usually having to do with customization.

0 07 Mar 2016 12:08 u/tribblepuncher in v/programming
Comment on: File extensions: .gz, .tar and tar.gz

I think in this case it's a combination of factors. While, yes, the three-letter extension has a major impact, there are also two other things to consider.

First and foremost, encapsulation. Encapsulation can be a difficult concept to actually put into practice if you're not versed in computers in the first place, even if it seems simple enough. GZipping a tarball is essentially a form of file encapsulation, which isn't immediately evident. Although archive files are a form of encapsulation in themselves, it isn't quite the same, since that is more of a directed encapsulation than a natural one, and it doesn't use the filename to denote the difference.

Second is the fact that Unix separates concepts so distinctly. Archival work is traditionally done on many platforms as a single, combined archive/compression step, whereas Unix does it in two independent steps. This is not the most intuitive thing in the world, especially if you're not used to the "Unix way" of having each program do one job specifically and very well. It's not obvious that someone might have thought to do it in two cleanly separated pieces.

0 04 Mar 2016 07:43 u/tribblepuncher in v/programming
Comment on: File extensions: .gz, .tar and tar.gz

Winzip, really?

Yes, really.

The line between Windows-centric and Linux-centric archive and compression formats has grown considerably blurrier over the last decade or so. This goes both ways, including some files intended for Linux use being put into a traditional .zip file. While you are unlikely to find Windows binaries by themselves in a tarball, source code and data are often transmitted in this manner, sometimes specifically for a Win32 environment. And if you're downloading source, even if you know C/C++ inside and out, you could still pretty easily trip over this if you haven't had experience with Linux - and a lot of people download sources for self-compilation for reasons other than modifying the actual code, e.g. compiling in custom options or attempting to tighten security controls.

3 04 Mar 2016 07:36 u/tribblepuncher in v/programming
Comment on: File extensions: .gz, .tar and tar.gz

This is due to the way that Unix was traditionally designed.

Essentially, most of the commands in Unix were put together so that - in theory - each did one thing and did that one thing well. As a result, there were two separate programs for compression and archiving. This made particular sense given that tapes were in heavy use at the time, and the tar format - short for Tape ARchive - was made to be usable with them. Furthermore, disk space was especially tight, so for a single file, skipping the archive step and simply compressing the file by itself saved overhead that may have been worth it in a lot of cases.

In a traditional Unix environment, you will usually actually run a .tar.gz through BOTH programs, even if you only invoke one command; given the correct command parameters, tar is capable of driving GZip directly to decompress the archive as it extracts it. Hence the .tar.gz extension: it is an archive file that has then been compressed. This combination is frequently known as a "tarball." The extension .tgz, as someone else noted, is short for .tar.gz. It isn't so popular anymore, but it was very popular in the 1990s, especially for users of Linux or BSD who had to interoperate heavily with a traditional DOS environment, particularly Linux users who might be operating on top of a DOS filesystem (see: the late, lamented UMSDOS filesystem option, popular on Linux distros circa 1994, long since rotted out of the kernel source code).
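
To make the two-step-versus-one-step distinction concrete, here is a small shell sketch (the file and directory names are invented for illustration):

```shell
# Set up a scratch directory with a sample file
mkdir -p demo/docs
echo "hello tarball" > demo/docs/readme.txt

# Two explicit steps: archive with tar, then compress with gzip
tar -cf demo.tar -C demo docs
gzip -f demo.tar                  # demo.tar becomes demo.tar.gz

# One combined step: tar drives gzip itself via -z
tar -czf demo2.tar.gz -C demo docs

# Either tarball extracts the same way
mkdir -p out
tar -xzf demo.tar.gz -C out
cat out/docs/readme.txt           # the original file comes back intact
```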

In this vein there are at least two more variations that might be of interest. First and foremost is BZip2, which usually compresses better than the traditional algorithm used by GZip (at some cost in speed). This results in the following file extensions:

  • .bz2 = single file compressed
  • .tar.bz2 = .tar file that has been compressed
  • .tbz = shorthand for .tar.bz2 (also written .tbz2)

Additionally, to make the whole thing even more complicated, there's the traditional Unix compress/uncompress scheme, which used the .Z extension. These are not terribly common any longer, but AFAIK most distros keep compatibility around for them because the utilities to work with it are simple and lightweight. These provide the following extensions:

  • .Z = single file compressed
  • .tar.Z = .tar file compressed
  • .tZ = shorthand for .tar.Z (I think, but I haven't seen this one much, if at all)

Note the capital Z for this scheme - as with all Unix filenames, the case makes a difference.
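
A quick illustration of that case sensitivity (hypothetical file names):

```shell
# On a Unix filesystem these are two distinct files
touch archive.tar.Z archive.tar.z
ls archive.tar.Z archive.tar.z    # both are listed separately
```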

As a note, 7-Zip's graphical UI is IMO one of the better ways to work with .tar files in a Windows environment, although you have to get used to it, since it doesn't quite work like most traditional archiver GUIs. That said, it may be worth your while to learn how to use gzip, bzip2, and tar on your own. I've found that when you're short on space, you can squeeze quite a bit of extra data onto a storage medium by selectively gzipping a few files. This used to be common enough, in fact, that many older web browsers would automatically decompress .gz-compressed text files and display them as though the user had just downloaded a plain text file - e.g. myarticle.txt.gz was treated like myarticle.txt, but only cost the compressed size in transmission time.
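
Selectively gzipping an individual file is a one-liner each way; a small sketch (the file name is made up, and the content is just filler):

```shell
# Create a compressible text file and note its size
seq 1 1000 > myarticle.txt
wc -c < myarticle.txt             # original size in bytes

# Compress in place: myarticle.txt is replaced by myarticle.txt.gz
gzip -f myarticle.txt
wc -c < myarticle.txt.gz          # noticeably smaller for repetitive text

# Restore the original when you need it again
gunzip myarticle.txt.gz
```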

2 04 Mar 2016 07:29 u/tribblepuncher in v/programming
Comment on: ReactOS - Fake Or Potential Windows Alternative? Review And Extended Test Drive Of Latest Release

If they're running ancient apps, they will probably be better off using an older Windows in a VM. Supported or not, it is still probably going to turn out better than ReactOS. Wine may do better than ReactOS, but probably not by much. There might be a few corner cases where this is different, but for most consumer and desktop software, this is pretty much what you can expect.

0 03 Mar 2016 04:09 u/tribblepuncher in v/programming
Comment on: What is the fast track to developing for Linux?

One additional note that Voat didn't like me putting in (post exceeded 10KB):

In terms of installation and libraries, BE PREPARED FOR HEADACHES. Linux is something of a minefield for installation, and in particular, library woes can sometimes be extremely difficult to solve - to the point where the user simply gives up. Try to spare your users that fate when choosing libraries, and be very clear about which ones to use and which versions to prefer. This isn't to scare you, and you'll probably be fine, but this sort of thing can and does happen, so watch out for it and be sure to test both running binaries and compiling on other distros. IMO, Linux software installation is a vast, gaping hole in the armor of its push to the desktop.

1 17 Feb 2016 17:10 u/tribblepuncher in v/programming
Comment on: What is the fast track to developing for Linux?

First off, DON'T target your program at a specific distribution. That very significantly limits your audience and the utility of your application, especially if you go out of your way to use distribution-specific features. That said, it isn't terribly easy to do this by accident, so you probably shouldn't worry too much about it. There are SOME things you will have to choose on - probably the biggest being Gnome vs. KDE vs. generic - but that's not nearly as dire, especially since most distributions ship the files for both for compatibility (and also absorb the bitching from the fanatics of both for doing so).

Second, I am going to warn you that my knowledge is very dated, and I focused almost entirely on C. As someone pointed out, C++ is something of an 'odd man out' for a lot of things on Linux, although some very noteworthy subsets of the Linux ecosystem use it extensively, particularly anything relying on KDE and/or Qt.

In terms of the file structure for a project, that's going to depend on your development methodology, preferences, and the tools you use. There is no one, single way of doing it. There are projects that compile multiple libraries and binaries all out of a single directory, projects that use a vastly complicated array of directories for source code organization, compilation, supporting files, and documentation, and everything in between. You're best off choosing whatever is most appropriate for your situation. However, I have noticed a tendency towards a bin subdirectory for compiled executables, a lib subdirectory for compiled libraries that are part of your project, an include (or hdrs) subdirectory for header files, and a src subdirectory for the actual source code. Each of these may have its own subdirectories, particularly src and include.

In terms of tools, there are a ton of them. The most universal is 'make', and learning to write makefiles is a good thing. Someone mentioned autotools/autoconf, which can be very important for determining the exact specifications of your target system - important if you plan on distributing code the user is expected to compile. In fact, make is so central that a lot of projects use nothing more than a text editor (most often vi or emacs), the shell, and make. Makefiles are sometimes even used for installation and uninstallation of binary packages with no source involved. You will probably want to use a version control system; git seems to be all the rage these days. I don't know what the status of CVS is, although I suspect it's still serviceable, if not the most spiffy and up-to-date.
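
To give a flavor of what make actually does, here is a tiny sketch (the target and file names are invented; printf is used to write the Makefile because make requires a literal tab before each command):

```shell
# Generate a one-rule Makefile: greeting.txt is rebuilt whenever name.txt changes
printf 'greeting.txt: name.txt\n\techo "hello, $$(cat name.txt)" > greeting.txt\n' > Makefile

echo "world" > name.txt
make greeting.txt    # rule fires: greeting.txt does not exist yet
make greeting.txt    # nothing to do: the target is newer than its dependency
cat greeting.txt     # prints "hello, world"
```

The point is the dependency tracking: make only reruns the rule when name.txt is newer than greeting.txt, which is exactly how whole C projects avoid recompiling unchanged source files.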

However, more advanced IDEs are also available, including a few available on Windows, such as Netbeans and Eclipse (both of which can use C++ even though they are Java-oriented). A good list of some of the most popular is here: http://codecondo.com/10-best-ides-linux/ A more complete list is here: http://linuxmafia.com/faq/Devtools/ides.html And there's plenty more where that came from.

Libraries to use - there are a ton of them. The most important is probably the windowing toolkit, assuming you use one. Your choices are myriad, and you're going to have to do some research. GTK and Qt are the most common these days, but there are plenty more, and you really should examine what's available. I would NOT base code on the basic X11 libraries, nor on Xt. These are, respectively, the raw API for interfacing with X11 and a sample toolkit. The raw X11 API is difficult to use in most situations and usually best left to those who write windowing libraries. Xt was, I believe, a sample implementation of how to write a windowing toolkit that kind of took off on its own, much to the chagrin of X11 users from the late 1980s through the mid-1990s, and is probably best buried. Motif is also a name you may hear, and it is best not to bother with that either; it was based on Xt, and although it was THE standard for high-end X11 across the industry in the 1990s, it has basically faded into next to nothing these days with the rise of alternatives such as GTK/Gnome and KDE/Qt.

In terms of how to distribute it, to cover all bases the standard is usually a .deb (for Debian- or Ubuntu-based systems), an .rpm (for Red Hat-based systems; the non-commercial version is known as Fedora), and a tarball (for everyone else). If you plan on distributing source only, a tarball is usually what you'll be using anyway. As for where the files go, they usually go in some fairly generic directories. The specific standard is here: https://wiki.linuxfoundation.org/en/FHS For a brief overview: /bin is for important programs, /etc is for configuration files (you should probably not use this unless you're building something important to the system itself), /lib is for important system-level libraries, and /sbin is for extremely critical administrative tools. /usr is where most regular programs go; most of your binaries will go in /usr/bin, and /usr/lib is where most libraries go. /usr/local is where people most often dump their binary tarballs or custom-compiled applications. /usr/X11* are a series of directories you likely have little need to visit, so I won't cover those beyond saying they tend to hold low-level X11 stuff.

As someone else mentioned, if you are distributing source, consider using automake to generate makefiles on the fly. I would very strongly encourage you to pick up several distributions and run them in virtual machines to test installation on all of them. As a decent mix to test with, particularly for a "serious," commercially-inclined application, get Debian, Ubuntu, SuSE, Red Hat, Mint, Arch, and Slackware (for the lowest common denominator), plus whatever else is high on the distrowatch.com charts. At a bare minimum, test compilation against Slackware. Please don't jump me, fans of other distros; I just want to give something to start with. You might also consider some up-and-coming distros such as Elementary and Zorin. Note that most of this does not apply in the same way to Android; it uses the Linux kernel, but there are numerous differences in userland, which is where you'll be spending almost all of your time (unless you want to get into systems programming, in which case you won't be using C++ but straight C - no realistic negotiation there, end of story).

As per uninstalling, the major package managers uninstall things for you, don't worry about that. Uninstalling without a package manager usually consists of deleting whatever you put in, possibly leaving user configuration files.

Scripting languages are also possible, but they have their own ecosystems with their own norms and the like. These include Perl, Python, Ruby, TCL/TK, and a variety of others. Note that many of these tend to be somewhat trendy and come and go, so I would advise you to consider this when choosing your language, should you opt for scripting. I may be dating myself by even mentioning TCL/TK, in fact. Other languages, such as Java, are also readily available. I believe FORTRAN and Pascal are as well. NASM is available for Linux, but you'll have to look up the information on using it, and I don't advise it unless you want to write for the kernel or extremely low-level libraries (which would be an extremely bad idea at your level of experience).

As a note, depending on your goals, you will likely want to learn more about Linux architecture and Unix-specific programming conventions, such as signals and raw sockets. I do not have the name of a good book off the top of my head. Note, however, that this is fairly advanced and you may not need it right away, so while it is very helpful to know this stuff, do not overload yourself trying to learn how to write signal handlers, especially since, depending on the libraries you use, you may never even touch them directly anyway. Learning a bit about X11 architecture is also good, although the details of writing code directly against X11 are probably not going to be too helpful, as it is a very rare situation where you'd want to do something that couldn't be handled by any mainstream windowing toolkit. X11 may be on its way out anyway.

As per testing, I'm afraid I can't help you there, but debugging has one king on Linux - gdb. Learn it. A lot of IDEs, if I recall, have debuggers that are actually front-ends for this. HOWEVER, gdb is not the only option. There are a number of debuggers, some of which are much more visually-oriented. Some of these are here: http://www.drdobbs.com/testing/13-linux-debuggers-for-c-reviewed/240156817 and here: http://toppersworld.com/top-10-visual-linux-debuggers-for-c/ but I cannot vouch well for any of them.

Also, one last thing - if you have specific licensing needs, be sure to research the EXACT terms of the licenses on the GNU C++ standard libraries. I think there may be some licensing oddities there, but I am not sure, and if there were before, I'm not sure there still are. In fact, be aware of the legal nuances of any libraries you do use. Some are under free software licenses that will require you to release your code; while most of the standard libraries are not really impacted by this, some could well be, particularly niche libraries. You are unlikely to need a lawyer for anything non-commercial, but if you are in doubt, ask.

DISCLAIMER: Once again, my opinions only, much of this info is old, YMMV, I didn't do a lot of research for this. I am 100% sure I forgot things that people are big fans of; to them, my apologies, but this should show just how wide the Linux world is, and how many possibilities there are. Good luck.

2 17 Feb 2016 16:54 u/tribblepuncher in v/programming
Comment on: What constitutes 'coding'?

Yep.

In fact, Java bytecode is a real instruction set that has actually been implemented in silicon.

https://en.wikipedia.org/wiki/Java_processor

There are several, and I believe there were quite a few more about 10-15 years ago when Java was really gaining traction. Java essentially runs in an emulator (hence the name: Java Virtual Machine). In the case of some things, such as Swing, I believe there are specific interface hooks and the like; I'm not quite sure how that translates to actual processors that use Java bytecode as their assembly language, although I imagine most of them aren't running Swing in any event, so it's probably a fairly irrelevant question. This also brings up an important question - does an instruction set become interpreted if it is emulated, even though it also exists in silicon? If so, then probably every major architecture on the planet is interpreted at that point. Hence the "grey areas" I mentioned.

0 16 Feb 2016 09:36 u/tribblepuncher in v/programming
Comment on: What constitutes 'coding'?

Compiled language that runs on a virtual machine. And in a few cases, the virtual machine was actually made to work on real CPUs. So, coding.

That said, there is a lot of grey area, particularly in the subtle cases, such as Perl, which, IIRC, compiles and then executes from memory.

0 15 Feb 2016 17:30 u/tribblepuncher in v/programming
Comment on: Anyone know what happened to Dev-C++ and Orwell?

So basically Dev-C++ was a pre-configured front-end/IDE for MinGW that brought MinGW along for the ride in a single package?

Huh, TIL. I was under the impression they put together their own Win32 GCC port and made an IDE for that.

0 09 Feb 2016 03:12 u/tribblepuncher in v/programming
Comment on: Anyone know what happened to Dev-C++ and Orwell?

Actually there's at least one other reason why Dev-C++ was free - and that's because it was a very straightforward distribution of GCC for Win32. At present, so far as I know, there's Cygwin, which relies on a bunch of other stuff (since it's part of a simulated Unix environment), and there's at least one more that requires you to go through some kind of GPL authorization for anything you write using its copy of pthreads or something (I'm still not sure of the details on that one). I think there might be a few other "fringe" distros. The great granddaddy of Windows/DOS GCC ports, DJGPP, probably hasn't worked on a stock Windows system in years. But, long story short, there just flat-out aren't many GCC options, and Dev-C++ is the one with the least fuss.

Plus it's pretty simple and straightforward - even if Visual Studio is free, it can't be accused of that. I also imagine that fewer Linux/Unix libraries are available for it, and porting is a more difficult job without it.

0 09 Feb 2016 02:27 u/tribblepuncher in v/programming
Comment on: Which Programming Language Should You Learn? The Infographic to Code It All

While this may be useful to some people, half the chart basically points you to Python unless you've got some special case. This is a very biased perspective. There is a reason why there are so many programming languages out there, why there are so many non-Python languages out there, and why so many have survived for decades.

1 05 Feb 2016 02:58 u/tribblepuncher in v/programming
Comment on: Which Programming Language Should You Learn? The Infographic to Code It All

If you think this is bad, wait until all of the grade schoolers who "know how to code" start hitting the market. Not just anyone can program, and forcing it down kids' throats can also drive off people who would otherwise have been fascinated by it.

1 05 Feb 2016 02:55 u/tribblepuncher in v/programming
Comment on: Your opinion about Java: now and in the future on the labour market

Java is said to be the new COBOL. It is a rather hot language with a number of advantages, not the least of which is the inherent portability of its ecosystem, which also allows a lot of other languages to be built to emit compatible bytecode. You are not really wed to the language itself.

In terms of the language specifically, I hesitate to say 'it sucks' because although it has a number of very annoying quirks, it's still C-like, and I don't buy into a lot of the languages that do things "correctly" but make thinking and working in them far more difficult. However, it does leave much to be desired, and anyone using it is naturally expected to be a "Java developer."

I'm not sure how much of a future it has for desktop applications, though. Common advice for home users is not to install Java for fear of a dangerous exploit. IMO those fears are somewhat overblown, but not baseless. The few Java compilers that claim to produce "native" code (e.g. Excelsior JET) have plenty of junk in the trunk, e.g. a 1 MB .exe file dragging around 80 MB worth of .dll's. Others that produce EXEs, to my knowledge, tend to just cram a JVM in with the bytecode, which produces elephantine executables without a real performance advantage. Additionally, Java has bestowed its programs with the gift of ridiculous bloat unless you nail down the memory configuration hard. It may be good for the Java application when the JVM seizes 1 GB of heap, but it sucks for everything else on the machine, especially if a ton of it goes unused and the garbage collector seems indifferent to the problem. Additionally, the common refrain is that Swing sucks. I don't know enough about JavaFX to comment, but the fact that it isn't in OpenJDK (AFAIK) seriously hurts it, because things that are out of Oracle's grasp are good for Java.

Honestly, the greatest threat to Java's survival is Oracle. They have chronically shown themselves to be a ruthless, destructive, short-sighted corporation, and IMO one of the main reasons Java has much of a future is OpenJDK, which means developers are not inherently tied to the stock JRE/JDK. The two markets keeping Java in one piece are corporate server-type applications and Android, and Oracle is more than happy not only to bite the hand that feeds them (Google), but to attempt to sever and devour the arm and any other parts they can get their mitts on.

Despite whatever Oracle may do, Java is highly unlikely to go away for a very long time, not unlike COBOL, especially since (or so I am told) a lot of it runs on mainframes, which are still in use in large corporations for important "never-go-down" tasks, and they have a selling point of backwards compatibility into the 1960s. If I were you, I'd go for it. Even if you don't use it, if you're not experienced with programming, learning another language can give you a better perspective for future projects.

3 04 Feb 2016 12:04 u/tribblepuncher in v/programming
Comment on: Remix OS - Android-based OS For Desktop: Extended Review, Video Demonstration and Installation Instruction

Among other reasons, because Android tends to be a lot friendlier than Linux and BSD. Plus, Android has potential as a desktop OS, and considering the clusterfuck of Windows 10, now is a good time to be getting a more user-friendly iteration of Unix available for something other than Apple hardware.

And, lest anyone say it, yes, I am aware that grandmas use Ubuntu. But not everyone uses their computer like grandma, and most people will eventually want software that doesn't come in the repo (which, as far as I know, stays pretty much frozen once a particular release is out of date anyway). Plus, although it does not leverage the software library of Windows, it does leverage the software library of Android, which is pretty big and fairly user-friendly for the most part.

Actually I wouldn't be a bit surprised if it ends up having some level of Windows compatibility at some point, now that I think of it.

0 15 Jan 2016 10:35 u/tribblepuncher in v/programming
Comment on: Elementary Human Interface Guidelines - Advice on UI and software design

A lot of software does work without configuration. You can, for instance, pick up Word or LibreOffice and type a letter out without configuring anything after it's installed. However, it's not going to work nearly as well, or fully exploit its capabilities.

That said, this may just be a tip - or a "tip" - that tries to lobotomize the software to satisfy some UX point or another, resulting in crippleware (that sometimes gets to the point where the software controls you more than you control the software). I think it's about time that some of these design people start realizing that the software's default or "smart" settings are often not what the user wants or needs, and there should be ways for the user to override them. Unfortunately a lot of software, particularly on mobile platforms, seems to be deliberately discarding this notion. I think we're going to suffer for it.

0 15 Jan 2016 02:42 u/tribblepuncher in v/programming
Comment on: help me improve

Well, first off, a caveat - as I said, I haven't actually looked up software project management in a long time. However, with that in mind, I would have to say I favor the "spiral" model, primarily because it is more realistic. Waterfall is the "basic" software development lifecycle (SDLC) model, and it's pretty straightforward: design->code->test->deliver. However, it's pretty static, and if you go by it and it alone, you are going to be hitting your head against a brick wall because of your own process. Spiral is something like an iterative waterfall model - design->code->test->design->code->test->design->...->deliver. While this is more arbitrary, it's a lot more flexible, and to me seems much more adaptable.

Agile methods are a general category of SDLC approaches, and some of them are very good at producing reasonably good code very quickly. However, their advantages can often be double-edged swords. For instance, they often involve having coders embedded with the customer (customer being the generic term for whoever you're designing for), but this may not always be the most practical thing. Another example is pair programming, which assigns two coders to one computer to look out for one another. While this may lead to better-quality code, I personally believe it can not only be very frustrating, but can backfire big time if you have a team that doesn't necessarily jive well with each other.

I believe you would benefit from looking up a book on SDLC models; unfortunately I don't have the name of any in particular. Do know, however, that there are a lot of them, and sometimes people who are into these go way off the deep end - for instance, I recall one research paper that described SDLC models as corresponding to specific political systems. Most approaches have their time and place, and being familiar with a lot of them is necessary, even if only in a general sense.

C and assembly are very much worth learning in and of themselves, just because they get you a lot closer to how the computer actually works. This is doubly important if you are interested in writing high-performance code, e.g. game code. That importance is diminished if you are going to be mostly writing in scripting languages for a game engine that already has most of the basics written, but it may still benefit you in ways that are pretty subtle. Things like Java and C# tend to be implemented in C and C++, after all. C is for most practical purposes a subset of C++ (IIRC there are a very few tiny differences that almost no one ever runs into), and as such nearly every C++ compiler also works as a C compiler, so you may already have something of a head start there. Despite the fact that a lot of people try to trash C, it's about 40 years old and still going very strong, and many key open source and proprietary programs are written in it - Linux, Windows, Java, and many game engines being very noteworthy examples.

However, in the case of assembly, I would caution you not to spend a huge amount of time memorizing specific system nuances. Because assembly is most often used these days for writing extremely efficiency-critical code, there is a temptation to over-emphasize building code that matches particular processors very closely. That is not what you are there for, in all likelihood; you're there to learn generalities. You may come back to assembly if you ever end up writing code at that level (unlikely, but possible), but that time is not now. You are there to see exactly what the computer does, not to write the best possible code. C is another story, though, and I think it could be a lot more useful to you in the long run.

As for how C# works under the hood? Keep in mind that the C# runtime itself is almost certainly written in C or C++, and it probably will be important to know how it functions under the hood. You can read technical documents that will give you some idea, but unless you've got at least some experience with, say, memory allocation, they won't make the same impact. Garbage collection, for instance, was added to make writing programs easier, but it still ends up doing the same type of work you have to do manually in languages such as C. Learning the basics of how that works in the first place will give you a far better idea of what's actually going on. Also note that in a lot of cases you won't be getting THAT close to the hardware even in C. Things like system calls end up going to the kernel, which takes care of some of the heavier lifting and talks directly to the drivers. I may be biased, since C was the language I REALLY cut my teeth on, but it's probably going to be a lot less painful and intimidating than you'd think.

Additionally, I would encourage you to look into algorithms. In many ways this is the heart of computer science (which, at its core, is a branch of applied mathematics). It is also one of the hardest areas of computer science, so don't be surprised if you struggle. That said, the ability to at least APPLY algorithms can differentiate developers markedly. You don't have to be a great innovator or algorithm designer to know, for instance, to use a quicksort instead of a bubble sort - except in the few situations where a bubble sort is actually fine! (Bubble sort is typically only tolerable for lists of roughly 20 items or fewer if you want performance; otherwise stay away from it. Quicksort averages O(n log n), which matches the theoretical lower bound for general comparison-based sorting.) HOWEVER, this material can be very difficult without the support of college faculty, or at least people who are REALLY good at it, so if you go this route be ready for significant frustration if you want to go into it at depth. Knowledge of algorithms is one of the things that most often separates the self-educated from the college-educated, so it will benefit you greatly if you can get the hang of it - and even if you can't, the familiarity may serve you well. I suggested studying programming languages because how computers process them is one of the more fundamental computer science problems; it links well into the general theory of how to process things, and it's a classic problem for fairly obvious reasons that ties in a LOT of other areas (where do you think those garbage collectors came from?).

As a final note on game engines - aim to learn for fun, not as a living. A lot of people go into computer science thinking they're going to be the next Carmack. The vast majority of them will burn out within five years after being ground to a stub by a company, then get tossed onto the trash heap and replaced by the hundred other guys who would crawl over their own mothers to get their job - only for the same thing to happen to them. I'm not trying to discourage you, but the game industry is not a pretty place, and most CS people go in thinking that's where they want to go and that they'll manage to impress everyone. Most will be very disappointed. Game programming as a hobby can still be very rewarding, though, and game programming skills transfer to other sectors, so keep that in mind.

DISCLAIMER: It has been a LONG time since I've had experience with a lot of this, so don't take this as the unerring, absolute truth. Good luck!

1 10 Jan 2016 23:14 u/tribblepuncher in v/programming
Comment on: help me improve

While you can do all kinds of studying on your own, if you are so inclined, team programming on your own is pretty darn hard - basically by definition. I would suggest contributing to an open source project to better learn how to function in such an environment; AFAIK a lot of them welcome contributors of many different skill levels. You should also look up software engineering in general - a lot of it is group and process management, especially the various methods of implementing the software development life cycle (the "waterfall" model, the "spiral" model, and agile methods come to mind, but it's been a long time since I've looked this stuff up). In fact, software engineering seems to cover a lot of what you're looking for in general.

In terms of "advanced" topics, you may want to investigate algorithms and languages (e.g. how a computer processes programming languages). This will be more difficult without the support of faculty, but it is not impossible. Hardware optimization may be harder, since hardware is a moving target - the techniques you learn now may be useless in a few years, and generalized methods for teaching it are not exactly a high priority in a lot of cases. If you want a better idea of how the machine works, I would strongly recommend assembly language - despite people decrying it as fit only for barbarians, it lets you see how the computer works in great detail, and that will probably help you out. Another thing would be to learn C, specifically to learn about resource management (particularly memory). Languages such as C# try to hide a lot of complexity from you; this is both a good and a bad thing. Good in that in most cases you don't have to worry about it; bad in that when something isn't working right you don't even know where to start, let alone how to address it - nor do you know enough to recognize when you need to design or code around nasty gotchas.

1 10 Jan 2016 13:55 u/tribblepuncher in v/programming
Comment on: 2015's Most Popular Programming Language Was Good Old Java

This is quite ironic, since Oracle is busy trying to sue Google and kill or cripple one of the two things (Android) that have kept Java popular thus far.

4 10 Jan 2016 12:34 u/tribblepuncher in v/programming
Comment on: Starting a tech startup with C++

Because there are a lot of developers out there that think C and C++ (and, really, just about anything that compiles to machine code) is only fit for filthy, wretched barbarians, and scripts and interpreted languages are the One True Path for the enlightened sages of the future.

5 03 Jan 2016 12:26 u/tribblepuncher in v/programming
Comment on: The Website Obesity Crisis

Unfortunately, this has been a chronic trend with computers. As soon as new capabilities become available, they are exploited... but not in a good way. Programming is a particularly bad offender here - people write their code in interpreted languages atop bloated abstraction layers and class libraries. The constant refrain: "Computers are powerful enough now that the user can handle it." Yes, they can.

Until everyone is doing it and you've got dozens of programs running, all designed on the assumption that efficiency doesn't matter because computers have advanced so much. And all of a sudden... your fancy new computer runs slower than an older one, because it has to deal with this crap.

This is the website version of that. Unfortunately I have no idea what could be done to stop it, because the mentality seems to be pervasive. I know that in some ways this is more efficient for the companies, as burning off a lot of CPU cycles may be an acceptable tradeoff for rapid deployment of changes and more efficient use of developer time. Still, I can't help but think that sometime this is all going to bite everyone in the ass, hard. And websites are even worse, because those extra megabytes can be crammed full of obnoxious ads that the host is paid for and your computer has to choke down and process - which only adds to the swelling bandwidth and memory requirements (and that doesn't even touch on the security problems of many of these ads).

1 03 Jan 2016 10:03 u/tribblepuncher in v/programming
Comment on: I Want to Start Programming. Where Should I Start?

First of all, do NOT start with a visual language. You want to start off with text. It isn't exciting, but it lets you learn a lot more about how the computer works.

To that end, I recommend you consider starting with BASIC or Python. Note that when I say BASIC, I do NOT mean Visual BASIC, I mean things along the lines of QBASIC, if you can find it. These languages are good for learning the basic programming concepts, such as what a variable is, looping, simple math, functions, and so forth.

From there, move on to C. C is a lot closer to the hardware, so it shows you how things actually work. You will want to learn about pointers (be ready for some suffering). Note that using Visual C or Visual C++ is fine for this; despite the name "Visual," that's mostly a matter of being in the same product line as Visual Basic, and unlike Visual Basic it doesn't make "just text" projects hard. Also note that, with very few exceptions and a few language quirks, a C++ compiler also acts as a C compiler, because C++ is for the most part a superset of C. There are a few esoteric differences here and there, but you're unlikely to run into them.

Follow that up with Java, and then JavaScript. I suggest this order because Java will help you learn object-oriented paradigms, while JavaScript is forgiving - too forgiving. A forgiving language can often be useful, but when you're learning programming and want solid code, it lets what amounts to junk slip in - code that doesn't produce what you intended - and that makes for serious debugging problems, since the interpreter or compiler simply makes do with what it has rather than flagging that something was written badly.

Note that all of this is for learning programming - not practical use. The languages that I've listed are good for practical use. But practical programming (which is ideally done after you've learned it rather than hacking together a mess) will often use things that are not, for instance, strictly text. They also involve using external tools, frameworks, libraries, and other things specific to a particular language and/or platform, and this knowledge may be vital to a practical project but has nothing to do with underlying theory or a particular language's syntax. Visual Basic, as noted earlier, is an example of this - it lets you literally draw your program's interface. By doing this you can make quick and dirty programs. However, you can very easily dig yourself into a very deep hole by not knowing more fundamental aspects about the code. Other visual tools tend to suffer from this problem, e.g. the GUI builder in Netbeans for Java, which is even worse since Java was not designed to be a visual language in the same sense that Visual Basic was.

Be sure you learn more about how programming works before you get into graphics. This can very easily become a massive distraction - people who don't work with computers regularly may view text-based interfaces as barbaric. They are not, and graphics and graphical interfaces can require a LOT of code; if you don't learn how to use them properly, you're going to dig yourself into a huge hole. In particular, you will probably want to learn about file I/O, streams, handles (in concept and in practice), threading and inter-process communication, memory allocation, and networking, off the top of my head. Also note that some languages will try to do this stuff for you. In most cases it will at some point fail to do a good job, so you will need to know how to work with it yourself. A good example is Java's memory allocation system, which can let a trivial program eat up multiple gigabytes of RAM if you're not careful.

Consider learning this on Linux - consider learning Linux, for that matter. Linux is based on Unix, and Unix was in many ways designed by and for computer scientists; there is a reason most colleges have taught computer science on Unix since the 1970s. There is a distinct lack of visual languages but plenty of languages available, which ought to provide some direction. It can also help you apply certain concepts and learn how they work (and don't). For instance, Linux routes most I/O through file-style I/O, which underscores just what the flow of information between devices is, and how one abstract concept can be used in many different ways - often to link together two very different pieces of logic.

Be sure you learn HOW things work in each paradigm and language. In many languages it's possible to just repeat what you'd do in another language, but that's basically using a sledgehammer to cram a round peg into a square hole, at least across a non-trivial program. It can sort of work, but you will hurt over the long run if you regularly approach problems this way - you eliminate the language's strong points, emphasize its weak points, and often drag in the weak points of the original language/paradigm as well.

Do not get hung up too heavily on specific languages. There are many, many different programming languages, but the fundamental concepts can be more valuable than learning most specific languages.

Finally, one thing to note is that computer languages are NOT like human languages. A computer is a fickle beast which, at its lowest level, reacts only to specific instructions and will follow them mindlessly. It is up to you to be aware of how to talk to the computer, how to tell it what you want it to do, and how the logic fits together, and this is often no simple feat.

DISCLAIMER: Just my opinion.

0 24 Dec 2015 03:12 u/tribblepuncher in v/programming
Comment on: Why does the Java programming language suck so bad?

I would use Wikipedia as the starting point, with their list of programming paradigms, which seems decent on a cursory examination:

https://en.wikipedia.org/wiki/Comparison_of_programming_paradigms

However, you must keep in mind a few things.

First of all, there are a lot of lists like this, and most people have one language type they adhere to and may view the other language types as fit only for barbarians (and in some cases the authors are complete assholes about it, which makes me wonder if they have any idea of what they're really talking about, even if they make good points). Most, if not all, paradigms have a place, even if the author is so obsessed with their own concept of "the right thing to do" for programming that they're willfully blind to this.

Second, a lot of languages are hybridized to some extent. Sometimes this is done for efficiency, sometimes because the language designers needed or simply wanted it, and sometimes because the language was designed by committee. Just about every language I've run into has a dominant paradigm, but the others mixed in can mean the dominant one doesn't dominate by much. So for any specific language, how the paradigms are actually implemented matters, too.

Third, a lot of languages have many dialects. BASIC is particularly bad here: everything from modern object-oriented code to TI-82 programs to line-numbered, GOTO-infested code from 1977 is considered some dialect of BASIC. So the individual implementation of the language matters as well, along with whether there is a generally accepted standard (e.g. C99).

Fourth, some problem domains just blatantly encourage a particular paradigm. If for some reason you want very compact code, you may find yourself reaching for a pure functional language, even with drawbacks like performance and stack-space penalties. Other domains may prioritize being able to change quickly, plus established norms, as you'll often find with the scripting languages in web development projects. So even if a paradigm is repugnant to you for regular projects 99% of the time, there is probably a 1% where, for one reason or another, nothing else fits quite right, no matter how much you may hate it.

So, in short, this is not a simple problem to approach. Your best bet is to learn the generalities and pick up familiarity with several different language types (including assembler). Then you'll be better equipped to evaluate what a language actually is, what it's good for, and how to write code that uses its paradigm rather than bashing another paradigm into its syntax, and you'll have an overall idea of how it does what it's trying to do. Note that you do not need to LEARN dozens of languages to pick this up, although a lot of people do learn many - and understanding this stuff goes a long way toward picking up enough of a new language's syntax to reach minimal competency quickly.

1 24 Nov 2015 03:11 u/tribblepuncher in v/programming
Comment on: Why does the Java programming language suck so bad?

I'm rather skeptical of most claims that "XYZ language sucks so bad." Truth be told, how badly something sucks usually comes down to someone's specific opinion and, in some cases, specific beefs. If someone thinks direct machine language is a disgusting, barbaric approach, they're going to think writing C and/or unmanaged C++ should be a capital offense with no appeal. If someone thinks interpreted languages are a pile of bloated abstraction layers fit only for the lazy or the grossly ignorant who shouldn't program in the first place, they'll think the same of managed systems and scripting languages. People who value mathematical precision will scoff at procedural languages, while fans of speed and simplicity will call functional languages a pile of brain-bending gibberish. It goes so far that some computer science professors thoroughly mired in theory consider pretty much every language people actually use an abomination - and then wonder why nobody takes them seriously.

No language is going to be perfect, and neither is the software that makes it manifest (e.g. compilers or interpreters). If this eludes you, you are probably going to be in for a rather rude dose of reality at some point.

...except for Prolog, that shit's just plain nasty.

8 20 Nov 2015 06:13 u/tribblepuncher in v/programming
Comment on: Why does the Java programming language suck so bad?

It's expensive and risky.

There is a very real truth to the idea of "if it isn't broken, don't fix it." There is a reason that companies still pay for big, expensive mainframes, and one of those reasons is, in many cases, near-perfect backwards compatibility as far back as the 1960s.

3 20 Nov 2015 06:09 u/tribblepuncher in v/programming
Comment on: Developers Who Can Build Things from Scratch

In almost all cases you have to learn programming by doing; just offloading the work to third-party libraries doesn't cut it. Sometimes there is merit in reimplementing things as basic as libc (in part, anyway). Knowing how those libraries work helps a lot at times, especially if you might need customization. Some libraries require so much infrastructure that they're more trouble than they're worth for the project at hand, and sometimes it's faster to write a few procedures yourself than to learn the quirks of a whole new library.

And of course there's always the possibility that the library sucks, is inefficient, or is outright unsuited to the purpose (e.g. a highly secured program where you can't trust the library, or one where extreme compactness and a static executable are required). Libraries can also add a lot of bloat - these days many programmers assume "the computer has enough power/memory to do XYZ through an abstraction layer; no one will notice!" Unfortunately, when you have 400 things written on that assumption, you suddenly find out that while your computing power was significant, it was not, in fact, limitless.

So there are some very good reasons for implementing your own stuff, whether or not it's from a library, both for learning and for building new software.

1 17 Aug 2015 04:33 u/tribblepuncher in v/programming
Comment on: What are some programming jargon everyone should be aware of?

I think that software development training really needs to start including sections on when certain "proper" behaviors are taken way too far, and the detrimental effects they bring with them.

1 27 Jul 2015 04:11 u/tribblepuncher in v/programming
Comment on: What are some programming jargon everyone should be aware of?

Overload - means that two or more subroutines share the same name but are differentiated by context, typically their parameter types (e.g. one is fed integers, another text, but both have the same name).

4 26 Jul 2015 20:18 u/tribblepuncher in v/programming
Comment on: Where should I migrate my projects to (from Github)?

Some Python related project (I think Django) changed the technical standard terms "master/slave" after someone got offended.

Not the first time; I recall that there was an incident in Orange County, or possibly the city of Los Angeles itself, where people got their noses out of joint over the designation of master/slave drives. I think this was when they were still using old-style parallel IDE drives, maybe the early-mid 2000's or so. I am thoroughly unsurprised that others would be similarly thin-skinned and/or emotionally frail.

11 23 Jul 2015 15:26 u/tribblepuncher in v/programming
Comment on: [Q] What do you think of Clojure and other LISP programming languages?

DISCLAIMER: Haven't touched functional languages in... a REALLY long time.

I think you're running into what many run into in regards to functional languages.

Procedural language programs are essentially lists. Object-oriented language programs are lists intermingled with data in a much more modular form. This is relatively intuitive, IMO, to the human mind.

Functional languages, however, are a hybrid between mathematical equations and trees. That's a fundamentally different mindset, and it's where a lot of the power comes from. I've found that, in general, well-written Lisp tends to be a lot shorter than the equivalent procedural code, but it can be MUCH more difficult to understand.

I've personally found that they make more sense after you've gone through writing (or at least analyzing) a parser or interpreter for functional languages. They can be surprisingly short - some have a back-end that can fit on a single page! Unfortunately, I can't point you at one because it's been so long, but it seems to me that if you can look at it more like a tree, you can at least get something of a better grasp of the concepts.

That said, IMO, functional languages tend to be very counter-intuitive for a lot of people. They're GREAT for certain applications (genetic programming comes to mind readily), and they help with teaching concepts, but there are reasons why far fewer real-world applications are written in functional languages than in procedural or object-oriented ones. You REALLY have to twist your head around the paradigm, and for all its power, it is best for certain problem domains; in many others it gets in the way more than it helps.

1 13 Jun 2015 13:29 u/tribblepuncher in v/programming

archive has 9,592 posts and 65,719 comments. source code.