What constitutes 'coding'?

11    15 Feb 2016 09:10 by u/psioniq

A question I've been asking myself for a long time now.

When is a language considered 'coding', and when is it considered 'scripting'?

Some are obvious - like Lua. But what about C++ vs C# for example - is C# considered scripting? What about HTML?

Would love to read your thoughts on the subject.

24 comments

16

Compiled language == coding. Interpreted language == scripting. C# is a compiled language. HTML is a markup language that uses a parser so I would consider it a third animal more akin to "writing".

2

That was a nice and concise answer.

Thanks a lot for chiming in.

1

There are some complications in his answer though. For instance, C# can be used for scripting or for coding, which is a really cool feature. You can compile standalone executables (which actually contain an intermediate interpreted language format that is later translated to machine code at run time), but you can also write code that takes in arbitrary C# code and interprets it at runtime! See for instance this C# interactive shell.
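The same run-time-interpretation trick exists in any language with an eval facility. Here is a minimal sketch in Python (Python stands in for C# here purely as an analogy; this is not the C# interactive shell itself):

```python
# Minimal sketch of run-time interpretation: a source string is compiled
# to a code object and executed on the fly, the way an interactive shell
# evaluates whatever you type.
source = "result = sum(n * n for n in range(5))"

# compile() turns the source string into bytecode at run time...
code_obj = compile(source, "<dynamic>", "exec")

# ...and exec() interprets that bytecode immediately.
namespace = {}
exec(code_obj, namespace)

print(namespace["result"])  # → 30
```

So the "compiled vs interpreted" line blurs even inside a single program: the host was compiled ahead of time, yet it interprets new code on the fly.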

This question would have been a whole heck of a lot easier to answer in pre-java/just-in-time-compilation times!

2

So what's Java? It's compiled to bytecode that's interpreted.

Your definition is so narrow.

No one says they're 'scripting' when writing JavaScript. People also say they 'code' html.

OP, there's a book, 'Code Complete', you should read.

0

Compiled language that runs on a virtual machine. And in a few cases, the virtual machine was actually made to work on real CPUs. So, coding.

That said, there is a lot of grey area, particularly in subtle cases such as Perl, which, IIRC, compiles and then executes from memory.

0

Perl 'compiles' to bytecode in a very similar way to Java. It's still interpreted.
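Python follows the same compile-then-interpret pipeline as the Perl/Java model described here, and it's easy to poke at: the standard `dis` module shows the bytecode that the interpreter's virtual machine actually runs.

```python
import dis

def add(a, b):
    return a + b

# add() was compiled to bytecode the moment it was defined; dis lists
# the instructions the Python virtual machine will interpret. The exact
# opcode names vary between Python versions.
instructions = [ins.opname for ins in dis.get_instructions(add)]
print(instructions)
```

There was a compilation step, but what comes out is interpreted - which is exactly why "compiled == coding" gets murky.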

0

I do not know about Java but I think it is the same as C#.

C# is compiled into IL (a kind of virtual assembly). The original source code does not exist in this IL. The IL is then converted to processor-specific machine code on the fly.

The distributed assemblies do not contain the original source code anymore, but in the case of scripting you have to ship the original sources.

0

Yep.

In fact, Java bytecode is a real instruction set that has actually been implemented in silicon.

https://en.wikipedia.org/wiki/Java_processor

There are several, and I believe there were quite a few more about 10-15 years ago when Java was really getting traction. Java is, essentially, running in an emulator (hence the name, Java Virtual Machine). In the case of some things, such as Swing, I believe there are specific interface hooks and the like; not quite sure how that translates to actual processors that use the Java bytecode as their assembly language, although I imagine most of them aren't running Swing in any event, so it's probably a fairly irrelevant question.

This also brings up an important question - does an instruction set become interpreted if it is emulated, even if it already exists in silicon? If so, then probably every major architecture on the planet is interpreted at that point. Hence the "grey areas" I mentioned.

0

People also say they 'code' html.

People can say whatever they want, that doesn't mean they're correct.

0

I think this might be at the root of my original question.

I've met quite a few people who used that exact phrase to describe what they were doing - and I always felt it was a little wrong.

Sure, if you're doing MVC alongside the html - you are definitely in 'programming land'. But I've just always seen pure html as a form of interactive DTP, and not 'real' programming - since, most of the time, you are just writing, well, markup. There is usually no real logic involved.

I'm not bashing html - I've done my fair share. I think it's just one of those little things, where it irks me when someone calls themselves a programmer, and all they've ever done is html.

0

UnrealScript was very much a scripting language but was compiled. Python is used to write serious server software and is interpreted.

Some languages, like Lua, C, and JavaScript, have both compiling and interpreting implementations, but the person writing the software (i.e. doing the “coding” or the “scripting” in question) doesn't need to know which implementation their code will run on, which means that by your definition, the question of whether they were coding or scripting only gets resolved when their software is run.

I work as a developer and in my experience "scripting" refers to writing quick scripts to automate existing software, like, say, adding new menu items to Photoshop. "Scripts" can be quick one-file Python programs to do simple things on a server, but writing them isn't often called "scripting". "Coding" is a catch-all term that covers pretty much all cases, except, as you mentioned, markup languages, which is usually just called "writing some HTML".

7

"Scripting" usually refers to automating one or more existing programs with a scripting language. Performance, architecture, security, correctness, etc. are not really a concern so much as getting the job done.

"Programming" or "software development/engineering" will take those other things into account because the goal is to make a robust application or library.

"Coding" is what lay-people call the other two things. Try to avoid these people at all costs when learning because they usually have terrible advice, practices, or understanding of how computer fundamentally work.

3

This is the right answer. (Source: am developer)

0

Can confirm. Source: am developer.

0

Can develop. Source: am confirm

0

You are right about this. I never use the word "coding" - I say "programming" when working in C#.

And "scripting" I use for small hacks, like compile scripts, or stuff to move files. These scripts are intended to be easily modified by other people who are not that deep into programming.
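The kind of throwaway file-moving hack described here might look like this in Python (a sketch with made-up folder names; it creates its own sample files so it can run anywhere):

```python
import shutil
from pathlib import Path

# Hypothetical folders for illustration: set up a source folder with a
# couple of .log files so the script has something to move.
src = Path("build_logs")
dst = Path("old_logs")
src.mkdir(exist_ok=True)
dst.mkdir(exist_ok=True)
for name in ("app.log", "db.log"):
    (src / name).write_text("example log line\n")

# The actual "script": move every .log file into the archive folder.
# No error handling, no architecture - just getting the job done,
# which is the essence of scripting.
for log_file in src.glob("*.log"):
    shutil.move(str(log_file), dst / log_file.name)

print(sorted(p.name for p in dst.glob("*.log")))  # → ['app.log', 'db.log']
```

Short, flat, and easy for a non-programmer to tweak - exactly the qualities that make something feel like a script rather than a program.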

0

That's a pretty good distinction too.

And I'll try to avoid myself at all cost! (I've been using the term 'coding' to cover everything for years - because I couldn't figure out what constituted what) :)

0

I'd try a different approach from most people here. Making a decision based on languages or compilation procedures is tricky, with all the JIT compilation, bytecode compilers, wrappers, and so on.

I'd make the decision based on purpose and development procedure. If the software you're writing is a big thing, which will be sold to customers, deployed somewhere, or published as an open-source project, then you are definitely coding.

If you are writing a small utility which will be used by your friends, workmates, or only by you, and which is written for just one purpose, then it's scripting.

Of course, the definition leaves a lot of wiggle room, but I think it gives a better distinction.

0

You could say:

  • Coding, when the end result does not contain the original source code.
  • Scripting, when the end result requires the source code.

Coding languages tend to support very scalable projects; with scripting you reach the limitations of the language pretty fast.
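That "end result doesn't contain the source" test can be illustrated with Python's own ahead-of-time compiler: a `.pyc` artifact carries marshalled bytecode and string constants, but not the source text itself. A sketch, with made-up file names:

```python
import pathlib
import py_compile

# Write a tiny throwaway source file (hypothetical name).
src = pathlib.Path("hello_example.py")
src.write_text('print("hi")\n')

# Compile it to a .pyc: the result is marshalled bytecode.
pyc_path = py_compile.compile(str(src), cfile="hello_example.pyc")
data = pathlib.Path(pyc_path).read_bytes()

# The literal source line is gone, though string constants survive.
print(b'print("hi")' in data)  # → False
print(b"hi" in data)           # → True
```

By the proposed definition this would make Python "coding" when you ship `.pyc` files and "scripting" when you ship `.py` files - same language, different verdict, which shows how much wiggle room the rule really leaves.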

0

So hang on, are you saying C# isn't a programming language??

0

Nope. I was just using C++ and C# as an example of where some might say one is 'coding' and the other is 'scripting'.

I use C# daily - and I do, kind of, consider myself a 'coder' - or, as others have stated, a programmer - which might be the better term.

0

ctrl+c and ctrl+v

That is all there is, and ever will be.

-1

Interpreted language = scripting
Pretty much everything else isn't so black and white