Comment on: No, it is not a compiler error. It is never a compiler error.
So far as I know it's always been a part of the C standard - it certainly has been for decades at least. The latest standard is here and the specification is in 6.9.1, paragraph 12:
12 If the } that terminates a function is reached, and the value of the function call is used by the caller, the behavior is undefined.
Comment on: No, it is not a compiler error. It is never a compiler error.
Not actually a compiler error! That's valid C code per the standard, including the weirdness. Functions with return types don't actually need to return anything. What is returned is undefined and would probably be different from platform to platform.
Comment on: quiz. write a function recurseList(f,x,n) that returst a list, [f(x), f(f(x)), ...], length n.
Not familiar with Perl, but it looks to me like your recurseList function is just mapping the result of a function onto a list. So if you used a doubler function it would return (for n = 5): 2, 4, 6, 8, 10, which is incorrect.
Comment on: quiz. write a function recurseList(f,x,n) that returst a list, [f(x), f(f(x)), ...], length n.
List<T> RecurseList<T>(Func<T, T> f, T x, int n)
{
List<T> result = new List<T>();
for (int i = 0; i < n; i++)
result.Add(i == 0 ? f(x) : f(result[i - 1]));
return result;
}
Sample usage:
> Func<int, int> doubler = (input) => input * 2;
> doubler(2)
4
> RecurseList(doubler, 1, 10)
List<int>(10) { 2, 4, 8, 16, 32, 64, 128, 256, 512, 1024 }
Comment on: Why Elon Musk is Wrong about AI
I think the problem with AI is that mostly everybody thinks they are qualified to have an opinion on it, even when they know absolutely nothing about the field whatsoever. AI issues do not suddenly emerge when we have Hollywood AI - there already are major issues, and they're only going to get more substantial. The recent achievement of computers becoming the strongest players in the world at Go is very relevant, even though without any knowledge of what's going on in AI it would not seem so. A few major points:
-
The developers have no clue how it works. This is probably the most important point. This isn't due to some magic or it taking on sentience or whatever, but because it utilized deep learning and recursive training (playing against itself to improve). So while the developers can explain how they taught it how to 'learn', they could never explain why it chose to play what it did on move 24, for instance. Its play is also so revolutionary that Google is publishing a highly anticipated book (among Go enthusiasts) filled with nothing but the machine playing against itself. In other words it did not simply apply human knowledge of the game with super-human capability, but rather it created new knowledge and ideas. And nobody can explain exactly how it did it. We're now applying deep learning to absolutely everything from profiling for employment, to military weaponry. What could go wrong?
-
It happened much faster than most anybody was expecting. This is a recurring theme in AI systems today. Most 'experts' felt we were decades away from having computers become dominant at Go, if it ever happened. I'll hit on why they thought this in a minute, but the point here is that technological progress is accelerating. A good sanity check for predicting where things will be in 'x' years of accelerating technology is that the prediction should sound crazy to most people. We're wired for linear progress - accelerating progress creates absurd sounding things like saying that in 40 years you'd go from computers the size of buildings, costing millions of dollars, to machines that fit in the palm of your hand, cost tens of dollars, and run millions of times faster. Crazy, right?
-
Go requires skills that computers are awful at and humans are excellent at. It is a strategic and "slow" (in the strategic and literal sense) game that emphasizes pretty much everything humans are great at: intuition, planning, 'understanding', and broad strategic goals. It's also practically infinite in depth. In Go each position has an average of about 250 possible moves. The average game lasts about 150 moves. Think about that for a minute. What that means is that just 10 moves in Go (that would be 10 moves for each side) would result in 250^20, or about 10^48, possible games. The fastest supercomputer in the world can currently do 18 quadrillion floating point operations a second. Let's just say that evaluating a Go position is no more complex than a simple floating point operation. It would take this computer 1,760,000,000,000,000,000,000,000 years to evaluate every one of those 10-move games. That number is in turn about 127,300,000,000,000 times longer than the age of the universe. Exponents are complete mind fucks, are they not? And of course a program that only looks 10 moves ahead is a -really- terrible Go program. It's fun to mock experts, but in this case those experts thinking Go was not getting solved anytime in the foreseeable future had every logical argument in their favor.
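If you want to check the arithmetic yourself, here's the same back-of-the-envelope calculation in code (a quick sketch using the rounded 10^48 figure from above):
double games = 1e48;               // 250^20, rounded as above
double evalsPerSecond = 1.8e16;    // ~18 quadrillion evaluations per second
double years = games / evalsPerSecond / 3.156e7;   // ~3.156e7 seconds in a year
Console.WriteLine($"{years:E2} years, or {years / 1.38e10:E2} times the age of the universe");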
Comment on: OpenAI Universe provides open source method to develop image recognition driving agents in GTA V. Pretrained agent available. [Video]
Yeah, strange! Looks like they just removed the page for now. No idea what's up there.
Here is a web archive of the page: https://web.archive.org/web/20170112124746/https://openai.com/blog/GTA-V-plus-Universe/
Here is a video with a third person cam of the AI system driving.
This is a first person view of the AI system driving.
Comment on: OpenAI Universe provides open source method to develop image recognition driving agents in GTA V. Pretrained agent available. [Video]
Really! The thing that surprised me most is that the pretrained AI is as competent as it is after training with just 600k images of the game's native AI driving.
Comment on: Can video game hacks ever be stoped?
It is, given current technology, relatively trivial to rapidly identify the 'class' of objects within a scene. Image recognition has been one of the biggest beneficiaries of deep reinforcement learning. The current 'Turing tests' for image recognition have gotten to the point that competitors are expected not just to identify features/objects in an image but to describe implicit relationships - the cat just jumped from the table on the right to the chair on the left.
The reason frame continuity is important is because you can prioritize a scene into various regions. So for instance if you know with high certainty that a certain region of the frame is a wall with no actors, then you can lower that region's processing priority for future scene analyses. And similarly if there's some sort of dynamic object that can't be confidently identified, you can raise its priority. And these prioritizations carry over from one frame to the next. For instance once you do find a player it is trivial to 'lock' onto him every frame until he is completely occluded. "Increasing latency" makes no sense. From a player's perspective it would be completely seamless.
Comment on: Can video game hacks ever be stoped?
I think massive AI is a substantial exaggeration. It doesn't even have to be particularly fast because of frame continuity - most of what's there in one frame will be there in the next. It could have a simple probabilistic detection that gets refined each frame.
Cheating this system would certainly be far easier than creating it. Latency is one challenge, like you mentioned (we need at most 17ms of latency for a 60FPS game), but that might not even be the biggest problem! Think about the bandwidth involved here. Movies are compressed like crazy. A game running at 60FPS 1080p without compression would require 1920*1080*4*60 bytes of data per second - that's roughly half a gigabyte per second. So the servers obviously need to not only render the data for every single player but then also create some sort of per-user relational compression before sending it out, which would involve contrasting two complete frame buffers. And then you need redundancies for when things inevitably and constantly break, without meaningfully interrupting the players' experience in any way.
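Quick sanity check on that bandwidth figure (assuming 4 bytes per pixel):
long bytesPerSecond = 1920L * 1080 * 4 * 60;   // uncompressed 1080p at 60 FPS
Console.WriteLine($"{bytesPerSecond:N0} bytes/s, roughly {bytesPerSecond / 1e9:F2} GB/s per player");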
I don't see any clear way quantum computing could help here either. Quantum computing isn't "just like regular computers but a million times faster." The technology behind quantum computers is useful for a very small subset of all problems. They'll definitely do things like break current forms of encryption, since factorization falls into this subset of problems, but quantum computing is not the future of computing that some are making it out to be.
Comment on: [Q] what's the best repo platform for a simple project?
This seems like it would be much better suited to a wiki than a full-on source control system. Google for 'free wiki software' and you'll get plenty to choose from. Keeping things as simple as [optimally] possible will save you a lot of grief.
Comment on: The Lost Art of C Structure Packing
Is there any compiler that doesn't take care of this? I wrote a C compiler myself (for a compiler class years ago) and one of the specs there was optimized type alignment, which is pretty straightforward at the compiler level. I think knowledge of this would be relatively important for somebody writing a low level compiler, but doing this by hand for software in general seems like it's going to be a waste of time at best and decreased performance at worst. And of course you need to rethink all your assumptions when you want to deploy on a different architecture. Nasty business.
For instance, I was surprised to discover that in some cases esoteric bit hackery for various mathematical operations ends up performing worse than doing things 'naively'. I have no clue why or how this is, but I assume the compiler is taking advantage of those 3 and 4 letter acronyms for hardware level optimizations (I've lost track of them after SIMD) and does better at optimizing 'normal' code, yet struggles with 'optimized' code.
Comment on: So, the U.S. Government just open-sourced a ton of their software
The second file I looked at was this. Near 0 documentation - I counted about 8 lines (mostly noninformative like a comment stating that a global variable is a global variable) for 700 lines of code, magic numbers all over the place, incredible variable names like lgid, gid, cgid, c, ectx, dctx, p, prog, and e. I haven't checked out the actual code logic yet, but yeesh. I started with the Department of Energy since that's where presumably some of the more mission critical code would come from.
I always wondered how the government could manage to spend millions if not billions of dollars on relatively simple projects and still have them blow up.
Comment on: Prime numbers.
Take a number, x, and imagine this number is composite.
Since it's composite there are two other numbers, n and m, that multiply together to produce x. What we want to know is the largest value, relative to x, that the smaller of n and m can be. The reason we want to know this is because that's as far as we need to check for divisibility. If the biggest it can be is 5, then we only need to run our for loop up to 5. In cases of large_number * small_number we only need to check up to the small number to get a hit. The worst case scenario is when the smaller factor is as big as possible, and that happens when n and m are the same number, so n*n = x. So what does n equal? sqrt(x).
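Which is why the usual trial-division check only loops while i*i <= x - a minimal sketch in C#:
bool IsPrime(long x)
{
    if (x < 2)
        return false;
    // Any composite x has a factor no bigger than sqrt(x),
    // so i * i <= x is as far as we ever need to check.
    for (long i = 2; i * i <= x; i++)
        if (x % i == 0)
            return false;
    return true;
}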
Comment on: What programming language SHOULDN'T you learn?
Lol, the internet where you can never be wrong.
Yes there are. There are dozens of CLI languages and many are not OO. This is a non-comprehensive list of some including various functional languages like Lisp, specialized languages like Prolog, and so on.
Comment on: What programming language SHOULDN'T you learn?
Java is fine but outdated. You can do more in C# and you can generally do it far cleaner. C# played catchup for the early part of its life, but for years now Java has been playing catchup to C#. And Visual Studio is more or less built around C# and vice versa. Java, last I was using it, was severely lacking in the IDE department. As just one small example, building a GUI in Java vs one in C# is just night and day. C# also plays much nicer for those instances when you need to break out of idiomatic C# and go native, execute unsafe code, etc. Java is basically idiomatic or GTFO.
And something not at all directly related to the language, but the original style guidelines for Java suggested the god awful brackets on same line as statement/declaration style. For the life of me I cannot figure out why a single person would ever like this. You save a line of white space, which has no tangible value, in exchange for a substantial penalty to readability which has a massive value. Why? WHY?? This legacy still carries through to many projects and it makes my eyes bleed.
Comment on: What programming language SHOULDN'T you learn?
.NET is a framework, not a language. It supports dozens of different languages. I assume you meant C#.
Comment on: A C++ style bug in C# or I can't believe this is not a warning
Yeah, I was also looking into that before just for precise refactoring of namespaces, which Visual Studio inexplicably does not support. Only problem is I just abhor the software as a service model.
Comment on: Will being a programmer become a near minimum wage occupation?
Being a highly skilled coder you can start your own business in a practically infinite number of fields, and with almost no capital requirements on top of that. So no, I think programming will, for the long foreseeable future, be one of the most valuable and useful skills. However, the "learn coding at a 3 month bootcamp and then go pump out some javascript" programmers will certainly fall by the wayside. Programming is a skill, much like a sport. There is a wide spectrum from terrible to absurdly skilled programmers, on a bell curve that's far more heavily weighted towards terrible. In today's society most of the people running businesses have no clue about programming, so we're still living in an analog of the era when sports teams just hired the cheapest possible athletes because hey, all athletes are more or less the same.. right?
The desirable companies tend to be run by programmers - places like SpaceX and Google. And these companies evaluate skill, hire very selectively, and pay accordingly. I think in the future programming jobs will likely be harder to obtain though. There's not only the competition aspect but I think more companies are getting burnt by hiring shitty coders and starting to learn from this. So pair greater competition with more selective employers.
I also would not discount the skills of H1B hires. Technically companies have to prove that they have a job which cannot be performed by a citizen to hire an H1B. And while there are some economic incentives like no social security pay for H1Bs, in general they receive very good wages even by American standards. It's not like Abu the coder earns $20k in a job an American would expect $60k for. To the point, I graduated from a top 10 university in computer science and the vast majority of my peers were terrible programmers. There was almost a sort of pride in it as well - "I prefer theory over practice." That's not a good culture. If other countries have a greater focus on merit and students more focused on application then it's very possible they're overcoming even the best of our education system.
Comment on: Basics of AI?
This is a very complex question that is heavily dependent upon the game you're talking about. It also depends upon the system you're using for AI. If I were to try to give a general answer it would be to intentionally cripple your heuristic evaluator. So for instance imagine we have a game where your AI's decision tree is based on a min-max algorithm and your heuristic system evaluates each position from a 1 (losing) to 10 (winning). A simple way to cripple this heuristic system would be to change your heuristic evaluator's final value to be at most 7. So now a 7 move and a 10 move become effectively identical to the heuristic and consequently instead of always playing the best move the AI will play anything with a 7+ evaluation with an equal probability.
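As a rough sketch (Position and FullEvaluate here are hypothetical stand-ins for whatever your engine already has):
// Cap the evaluator's output so every "good enough" move looks identical to the search.
int CrippledEvaluate(Position position)
{
    int score = FullEvaluate(position);   // the real heuristic: 1 (losing) .. 10 (winning)
    return Math.Min(score, 7);            // a 7 and a 10 now score the same
}
// With ties broken randomly, the AI picks any move rated 7+ with equal probability
// instead of always playing the strongest one.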
Comment on: Looking to learn Win32, any tips would be appreciated
C is a subset of C++. You're fine there.
But what exactly do you want to accomplish? Win32 is mostly a tool, and one whose functionality is now taken care of automatically in most modern IDEs. I programmed in Win32 for years and couldn't even begin to write the basic code for just getting a simple window on screen. It's full of strange conventions, variable names like lpfnWndProc, and borderline arbitrary typedefs of typedefs of typedefs like WPARAM and LPARAM (don't you dare mix up the purpose of those two highly descriptive type names either!), enough to leave you with no idea what types you're dealing with. And in fact in many cases it's okay to have no idea, as there will be parameters and other things that literally are just no longer used.
The entire API dates back to 16bit days. Instead of rewriting it as technology changed they just kept patching and adding onto it which in part is why it's so poorly designed. I think it could still be a good exercise in learning how an event driven messaging system works under the hood, but the entire API is a pitfall. Even when you know what you're doing you'll be spending 80% of your time on MSDN and 20% writing code. I would recommend free online tutorials over books in this case. There's always lots of little gotchas in things like this when developing direct Win32 on modern OS's on modern hardware. Types change, those variables that are no longer used were used at one time, and so on. Good luck!
Comment on: C or C++: Which is the language you prefer?
I'd never use the word advanced since it's going to bring out the "strongly opinionated fans" of one language or the other. I ordered them in terms of "What, why can't I [do thing from other language]?" You're never going to find yourself missing a feature from assembly in C. You're never going to find yourself missing a feature from C in C++. And you're [almost] never going to find yourself missing a feature from C++ in C#. There are a couple of exceptions there, like neat things you can do with templates such as template metaprogramming, and things that are nasty, like multiple inheritance, but in general it's true. C# was designed to take the best of a variety of languages including C++, Smalltalk, and Java and create a modern language featuring all the strengths of those languages while minimizing their individual weaknesses. It's for this reason that you'll find updates to the C++ standard regularly borrowing things that are already present in C#. It's also for this reason that having a very good understanding of what's being abstracted away, like memory management, is absolutely crucial.
Comment on: C or C++: Which is the language you prefer?
C is fundamentally a subset of C++. If you simply restrict yourself to a subset of capability, you can treat C++ exactly like C. C# vs C++ is where fundamental differences start to come into play.
If you're considering which language to learn I'd always recommend going from the bottom up: assembly, C, C++, C#. This is not commonly recommended since the lower level languages are quite challenging for a new developer to wrap their head around, but if you can manage it you'll achieve a far greater understanding of the niceties each language gifts you with and their underlying costs, which can allow you to develop far more robust and performant programs than somebody who started at the top.
Comment on: Can Wikipedia Views Predict Market Movements? My first R Programming Project.
Do you know that as of 2013 upwards of 75% of all stock trades are being formulated, priced, and carried out by AI?
Ironically, what humans think is no longer as relevant as what AIs think we'll think. Although at this point of the metagame maybe that's not even true anymore since I'm sure the AIs are tuned to optimize performance against market conditions which would be primarily other AI, further reducing the relevance of human behavior.
I do think the fundamental idea is neat though. I wonder if there's a correlation between polling numbers and Wiki topic views.
Comment on: Stackoverflow needs to be circumvented. rant + ramble
I'm not really seeing your complaint. Stack Overflow is a resource and the question is whether or not it does a good job at that. As I think most any developer could attest, the answer is yes. I don't really care what high karma, point, or whatever they call their little numbers people are doing. I do know at least some of the very high karma users are very deserving of it. If you program in C#, there's a very high chance Jon Skeet has answered at least one of your questions, and he has about a billion magical points, so good for him. If companies are genuinely hiring people because of a SO score without having the intelligence to verify that their content is meritorious then they are probably the last people you'd want to work for anyhow.
"Can we please get rid of the brain-damaged stupid networking comment syntax style, PLEASE?" Linus Torvalds rants against ugly commenting styles
29 22 comments 12 Jul 2016 17:26 u/rwbj (..) in v/programmingComment on: Agile, Unit tests and rapid release cycle is pure evil.
The problem is that that's incredibly myopic. It's obviously not just about elegance but about functionality. It's not a matter of having a shiny underlying codebase, but robust functionality under that shiny UI layer. People do care about these things but it's more of a longrun issue. We might need to adjust the saying to fool me once.. twice.. 800 times.. shame on you, fool me 801 times - shame on me. I think Windows 10 is a perfect example here. Microsoft began releasing operating systems purely as an iterative means of revenue generation with some major shiny-layer-changes but less and less difference under the hood. By the time they got to Windows 10, they literally could not even get people to install it when it was digitally delivered and free.
And we can see this in nearly every industry. I think the video game industry is a great example here since games have gone from an art to a factory-line level production of focus group iterated mush. The biggest console today is the PS4. It's sold about 40 million units. The PS2, more than a decade earlier, had sold about 50 million units at this point in its life to a much smaller population, and the relative decline there is rapidly increasing. The population part is even more telling. Our population in the US alone has increased by nearly 40,000,000 since 2000, or about 13%. That rapid population growth should be reflected in correspondingly record sales figures for everything, even if the industries were just breaking even. But we're not seeing that - the case with video games is typical, and we're actually seeing many industries in decline. Account for population growth and they're in precipitous decline.
The reason this annoys me so much is because management is supposed to be the longview people. But shareholder syndrome has destroyed this. Shareholders don't care about companies - they care about profit, short term profit. And this interest filters immediately onto the executives and down to management, which is completely bastardizing so many industries into timebombs of failure. But immense barriers to competition mean that even in decline it's difficult to compete against these sorts of companies, so we see ubiquitous shit everywhere even as it leads to the decline of the peddlers.
Comment on: TIL: In a numeric system of base 'x', 10 is always equal to 'x'. Binary 10 = 2, Decimal 10 = 10, Hex 10 = 16.
Looks good!
Comment on: TIL: In a numeric system of base 'x', 10 is always equal to 'x'. Binary 10 = 2, Decimal 10 = 10, Hex 10 = 16.
I was just thinking something like this:
Jesse Howell on Twitter: "@notch @kkearns Ah... I get it now. I like Base 10 (binary) and base 10 (hexidecimal), but I use base 10 (decimal) most often."
Comment on: TIL: In a numeric system of base 'x', 10 is always equal to 'x'. Binary 10 = 2, Decimal 10 = 10, Hex 10 = 16.
Hahah, you've gotta refine the question a bit. The answer as posed is infinite, as you can have arbitrarily small fractional bases. For an integral base you can give another cheat answer: base 1 with a single symbol 0, such that 5 in base 1 = 00000. An integral base > 1 is an interesting and ostensibly tricky one though!
Comment on: TIL: In a numeric system of base 'x', 10 is always equal to 'x'. Binary 10 = 2, Decimal 10 = 10, Hex 10 = 16.
Nice bot. Might be better to use the standard quote system with '>' instead of the code block since there's no word wrap in code blocks.
Comment on: Trigger input event automatically on value changed with code
There are no "browser quirks" here. The jquery code is functioning as designed and intended.
What I was initially pointing out was just the god awful syntax of javascript, which rapidly becomes self obfuscating on projects of any reasonable degree of complexity. But we could go beyond that to hit on what you're saying. The interaction between javascript, HTML, and the user is plainly terrible, but there is literally 0 reason for this to be the case. Your comment regarding my code doesn't even make any sense. Whether the code I'm showing is executed client side, server side, or some mixture in between has 0 relevance, at least not in an app.
And this is comparing apples to apples. HTML and javascript were never designed for what they do today, but continue to exist largely because they've become so ingrained that moving on from them would be difficult. Imagine instead we used a standard basic (limited instruction set) byte code in a natively stateful system. You could improve security, performance, flexibility, developer productivity, and more overnight by orders of magnitude. The tie-in attempts to do this, such as Java applets, have fared poorly not only because they were again hamstrung by being tie-ins instead of replacements, but also because they were unilateral efforts. That not only meant relatively poor implementations, as one company was left to release virtual machines for every single enduser platform, but also limited receptivity. Have Microsoft, Google, Firefox, and the other big players work to develop the standard and leave implementation to browsers and perhaps even more ideally - operating systems.
I mean, doesn't it ever occur to anybody else that the current state of HTML+scripts is shockingly reminiscent of early programming languages like COBOL? It was designed by people who tried to solve a lot of problems but could never have imagined what the future would hold. And at first the solution was to keep trying to tack more things onto it, but in the end a complete reboot often comes with a short term productivity loss and long term exponential gains, and I think our web interface is long overdue for a reboot.
Comment on: Trigger input event automatically on value changed with code
I actually offered a more general solution. If you just want to call the callback when the object changes then you can do it in one line!
someTextBox.TextChanged += (sender, eventArguments) => DoTextChangedCallback();
The syntax isn't as clear, but TextChanged is an event that's fired when the text is changed, which you are subscribing to. You can hook anything into that. When the event fires it sends a sender (in this case the textBox that was updated) and whatever eventArguments are appropriate. The => is just the syntax indicating a lambda, so it's an inline way of dynamically generating a nameless version of this method:
private void MyTextChangedCallback( object sender, EventArgs e )
{
DoTextChangedCallback();
}
And that method is now called each time the text changes. So we've now achieved an object specific callback in one line of code!
Comment on: Trigger input event automatically on value changed with code
Definitely will. It's C# like CyrxCore2k said. I have no love for Microsoft's actions nowadays but it is by far the most elegant and clean language I've used. So for instance one neat application of the above, properties, is for config files. I wrote a program to parse my config file and write properties for each config setting. And it also sets a callback to a config file update method each time the value is changed. So now using config file variables and keeping the config file synchronized is a matter of simply doing "ConfigFile.varName = 1234;" and it all just works. Adding a new variable is nothing but running the config file program and having it auto build the source file which is a total of two clicks. Another cool feature is partial classes. That enables you to spread the implementation of a class around multiple files via a simple syntax "public partial class ClassName." So each time I run the config builder, it can safely overwrite the entire auto-generated source file as it's a partial implementation that includes only auto-generated code. Just beautiful and productivity+++++++.
More neat things about properties. Imagine you want an externally readonly variable. Set the set part of the property to private and boom - readonly variable. Imagine you want a constant variable that's dependent upon dynamic values. Don't specify a setter at all and its value can only be set in a constructor, using dynamic values. You could even create write-only variables as for instance a sort of blind-input mechanism by doing the same sort of stuff to the getter.
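A quick sketch of those variations (the names here are just illustrative):
public class Example
{
    // Externally read-only: only this class can assign it.
    public int Score { get; private set; }

    // No setter at all: can only be assigned in the constructor,
    // effectively a constant built from dynamic values.
    public string ConnectionString { get; }

    // "Write-only": callers can feed a value in but never read it back.
    private string secret;
    public string Secret { set { secret = value; } }

    public Example(string server, int port)
    {
        ConnectionString = $"Server={server};Port={port}";
    }
}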
Comment on: Trigger input event automatically on value changed with code
Javascript makes me shudder.. Compare that to:
private string someStringValue;
public string SomeString
{
get
{
return someStringValue;
}
set
{
someStringValue = value;
DoTextChangedCallback();
}
}
It's funny: in another thread discussing an interview problem, I posted an efficient solution and the person thought I was posting pseudo-code! Such a clean and clear language.
Comment on: Policheck - Remove the term "whitelist"
The typically charged title. "Fuck you, I quit. Hiring is broken"
Comment on: Policheck - Remove the term "whitelist"
And this is why people roll their eyes when somebody puts contributor or developer for open source project "blah blah" as a qualification in and of itself.
- 1. press ctrl+shift+h
- 2. put whitelist in the top box
- 3. put acceptlist in the bottom box
- 4. press okay.
That is literally all this person did. Maybe 10 seconds of work - 0 coding. Maybe less. Modern IDEs generally have awesome built in tools for refactoring so it may have been right click - refactor - rename - okay. Things like this also make me snicker when I see things like the recent article where somebody was outraged that, instead of being asked to talk about their work on open source projects during an interview, they were asked to put their skills on display. They did not pass.
Comment on: [Why does my code work] I have created a parser in C and if I do not allocate and immediately free an array it breaks.
You're making assumptions about memory layout and allocation that are not valid. The result of malloc with a size of 0 is implementation-defined - you may get a null pointer or a pointer you can't actually use. Don't do it. Exactly which region of memory malloc will allocate is also not specified. Don't make assumptions. You're doing both of these things. Allocate a global buffer and reuse that buffer, obviously first nulling it and ensuring that the buffer is large enough to handle your data. Don't repeatedly allocate and free. Don't make assumptions about what memory is allocated.
Assuming you want to add in more complex stuff, you also might want to check out LEX/YACC. Great tools for lexical analysis and execution in C.
edit: Important note to add. Just because your code functions in one environment in C does not mean your code works. Your code in this project does not work. It makes assumptions about the result of a malloc of 0, which means the code is broken. Whether or not it gives you the right output for one specific scenario in one specific environment is immaterial to the question.
Comment on: Which programming language to learn first [infographic]
I think it's the logical benefit of starting low level first. I inadvertently started with assembly first. Got a throw-away 5" green monochrome IBM from my step-dad's company. Started running all those neat little "*.com" files and ran into debug.com. Ended up buying an OS book at Half Price Books back in the day when they'd share little TSR (terminate and stay resident) programs in plain hex text. Entered it into my precious debug.com to create a fully functioning program, and I was hooked.
I can't imagine how somebody who starts on something like javascript is ever going to grasp programming if they build their subconscious intuition around that language. Starting high level is well intentioned advice since it's easier and ideally closer to natural language, but it seems like the quality of programmers this sort of education produces could be indicative that it might not be the best idea.
Comment on: Why can't programmers... Program?
Being excessively defensive and near immediately resorting to ad hominem from the beginning speaks clearly enough of your authenticity.
Hint: Look at the deletion time. I deleted my comment before you even responded to it.
Comment on: Why can't programmers... Program?
The reason I deleted it is because this is incredibly predictable. I know what I'm saying is true. What you're saying runs contrary to this, and thus I expect you're probably lying. And even if, for reasons you are clearly not capable of explaining, what you're saying is true, it inevitably leads to a pointless back and forth of "nuh uh, uh huh!"
Comment on: Why can't programmers... Program?
I deleted my comment for that very reason, knowing you were going to double down on this. Up to you.
For what it's worth the Google interview process is also not a solitary interview, unless you were being weeded out in the prelims. It's a multistage interview process where you would typically be interviewed by 5 or 6 different individuals in the department you were applying for.
Comment on: Why can't programmers... Program?
What if the guy who went to MIT and worked at Google and who authored 3 open source projects can't do fizzbuzz? How do you explain away his success?
Not gonna happen. I think most folks would agree, love or hate Google, that they have and continue to collect some of the top talent in computer science. And their interview process is specifically geared towards skills over charisma.
And I think this is important. Verifying something that I think most people already know, a sociological research paper from not that long ago showed that narcissists have a vastly better outcome in the traditional interview system than more or less anybody else. Having people talk about themselves or answer questions they've had weeks to research and prepare for is completely pointless. When you ask somebody what the biggest challenge they overcame was and how they did so, you're not learning anything about that person aside from what they researched about the best sort of answer to that question. They could very well invent problems they never really had if they could think of a clever solution for them. It's all completely fake, like most interviews.
The Google interview process is geared towards pushing the interviewee into a technical problem they're not familiar with and seeing how they overcome it. If you can't 'fizz-buzz' you're weeded out in the preliminary interview. And the talent they end up with shows this is absolutely a good way of interviewing.
Comment on: Open source versus corporate interest
I think the reasons behind AI, in particular, go far beyond what you're looking at. Many companies, most assuredly including Google, know full well where AI is taking us, and could take us. I think OpenAI phrases it quite eloquently in their mission statement. OpenAI is a billion dollar nonprofit started by many of the biggest names in Silicon Valley with the specific goal of not having to make a profit and instead working to ensure that AI is something that can and will be utilized to the benefit of all of society and not just an elite few.
This isn't also just singing kumbaya. The finish line for advanced AI is not really the finish line, but the starting point of an entirely new game. We're not competing to build a racecar so it can go drive around in circles against other racecars. We're competing to build a racecar so we can take it out into the wild and then do amazing things with it. AI isn't the goal, but the medium! And once advanced AI is a reality it's going to completely reshape everything we do. A company like Google would be extremely well positioned to be one of the biggest players in this brave new world, so open sourcing and pushing AI forward as fast as possible is something that is not only great for society, but also makes sense even from the perspective of their own self interest.
Comment on: Programmers how do you tackle the feedback problem?
You seem to not know the meaning of anecdote.
Comment on: Programmers how do you tackle the feedback problem?
You're moving the goalposts now. Yes, you obviously do need to be able to support yourself somehow before you can begin producing your own works. However, OP is in a situation where he is already producing his own works.
Comment on: F*** You, I Quit - Hiring Is Broken
Oh no, I didn't take it that way. I think it's great that it looks like pseudo code. I strive for clarity in code so that's definitely a compliment! In any case the algorithm is actually perfectly performant. The entire memory required is N pointers. You could make the traversal nonrecursive if necessary, but that would not be an issue in anything except rather extreme circumstances. I think it's a pretty nice implementation!
As for the interview stuff, interviews at Google don't work like at a normal company. It's not like you have your interviewer who then deals with a line of 50 people chopping them off one by one like the headsman at his block. It's a full day affair full of numerous interviews with people in the department you're interested in. The questions are prepared and reused and designed to test the technical creativity of the applicant. The phone interviews are relatively trivial and are just to ensure somebody has a basic degree of competence.
In any case this individual is in no way describing the scenario you are. He seemed mostly upset that the interviewer only focused on actually doing things instead of talking about things he'd done in the past. And this is by design. Most of these comments are coming from chats with a friend who worked as an interviewer at Google, as many of their engineers do. One night we had a conversation about why I'm working solo since my technical interests align pretty much perfectly with Google's pursuits. My complaint was pretty similar to yours, in that I felt the entire hiring process was a charade that I didn't feel like participating in. It turns out that Google's entire hiring process, and the reason they arguably end up with more top talent than any other company, is because they've specifically developed a system to counter this charade. Occam's razor here is stating pretty clearly that this guy is probably just not a particularly sharp coder.
Comment on: Programmers how do you tackle the feedback problem?
Interesting anecdote. I was just curious and decided to look it up. It turns out his movie, Riddick, netted 80% as much as Tokyo Drift at the box office, and then made tens of millions in DVD sales on top of it. DVD sales for Tokyo Drift were not immediately available.
I think a big problem in making a product for the consumer is that it's impossible to know the consumer. I mean most major corporations today have gradually come to be 100% about this process and they, even with effectively infinite resources for consumer research and all the fancy statisticians money can hire, can't seem to figure consumer habits out.
If you can make a product, forgetting about the market for a minute, that you find enjoyable - I think it's all but a certainty that there are also plenty of other people who would also find it enjoyable. These products that try to appeal to the mass market rarely achieve the results they're looking for. Rather than bringing in huge numbers of new users, more often they simply alienate users who would prefer more focused software.
Comment on: F*** You, I Quit - Hiring Is Broken
Once again, I am not stating that the scenario you are describing does not exist somewhere. I am stating that the big named example of this article was not one of these employers. As an aside, that is compilable C# code - not pseudocode.
Comment on: F*** You, I Quit - Hiring Is Broken
I think you're kind of repeating yourself there. I get what you're saying, but what I'm saying is that his big lede example was not this sort of scenario that you're describing.
Inverting a binary tree is a really trivial problem that again shows whether or not somebody is capable of algorithmic thought. He even mentioned doing it on a whiteboard where he could have easily fleshed out his thought processes even if he couldn't produce the code 'cold'. I don't think I've ever had to invert a binary tree (I actually had to look it up to make sure it was what I assumed it was) and decided to give the problem a quick spin.
void Invert(Tree root)
{
Dictionary<Tree, Tree> parentLookup = new Dictionary<Tree, Tree>();
FillDic(root, null, parentLookup);
foreach (var entry in parentLookup)
{
entry.Key.left = entry.Value;
entry.Key.right = null;
}
}
void FillDic(Tree root, Tree parent, Dictionary<Tree, Tree> dic)
{
if (root == null)
return;
FillDic(root.left, root, dic);
FillDic(root.right, root, dic);
dic.Add(root, parent);
}
10 lines of code, and it works. The difference here is that in your example you're putting "anybody who has done significant RDBMS" as a prerequisite. The prerequisite for this problem by contrast is being capable of thinking in a logical and algorithmic fashion and having a grasp on the most fundamental computer science datatypes. Furthermore, solutions like mine which are not going to be the most pinpoint optimized give a clear insight into my thought process: "The only complexity in this problem is access to the parents. We need to get the parents in a fashion where we can iterate the nodes and mutate them without screwing up the iteration." From there it's wham, bam, done.
Comment on: F*** You, I Quit - Hiring Is Broken
This may be the case, but I know for a fact that what I said applies to one of the companies he listed by name - Google. While there certainly may be some issues with hiring at some places, I think there's also another issue at play here. In computer science there are a large number of incredibly skilled individuals, since for many the work itself is intrinsically rewarding. This means some otherwise smart and skilled, but not as skilled, individuals are going to end up being second tier candidates. At most companies they'd still be in the upper tier of engineers, but at a company that attracts all the top talent like Google - they're not even going to get hired. And I have 0 doubt that some percent of these people are going to reconcile their failure not as a failure of themselves but as a failure of the hiring system itself - How could they possibly not be hired?!? Clearly the problem must have been in the hiring system itself!
Comment on: F*** You, I Quit - Hiring Is Broken
Again I don't even think you need to have BFS memorized. As long as you conceptually understand how a breadth first search works it's trivial to throw one together. I mean as long as you grasp that it's peers before children - which ought to be entirely intuitive - the rest is trivial.
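Something like this is really all there is to it (a sketch, assuming a Node type with a Children collection):
// Breadth-first traversal: a queue naturally visits peers before children.
void BreadthFirst(Node root, Action<Node> visit)
{
    var queue = new Queue<Node>();
    queue.Enqueue(root);
    while (queue.Count > 0)
    {
        Node current = queue.Dequeue();
        visit(current);
        foreach (Node child in current.Children)
            queue.Enqueue(child);
    }
}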
Comment on: F*** You, I Quit - Hiring Is Broken
This individual seems to be rather clueless about the goal of interviews. When they ask you a question, the intention is to ask something that you don't know off the top of your head to try to get a bit of insight into your thought process. A maze solving problem is a perfect example of this. They're not asking you to recite A*, they're asking you to share how you'd go about solving it. So you might start by drawing a simple maze, interpreting your own thoughts into general control flow, and then converting that into an algorithm. In fact simply code dumping A* would probably result in the interviewer asking a different question, as that response is almost as useless as "I'm not a recent college grad. You expect me to have this memorized?!?!?" The fact this individual is seemingly incapable of producing algorithmic thought without having memorized it is most likely the reason they were not getting hired for anything. All the above applies 100% for Google, which has a reputation as a tough interviewer. Your credentials get you in the door, they don't get you the job. Your display of skills and logical/creative thinking gets you the job.
Comment on: Have Software Developers Given Up? (an interesting read and so are the comments)
In my opinion a big problem right now is that the tools and languages for web development are just not particularly good. Javascript is the standard for client side execution. It's a weakly typed, error tolerant, dynamically interpreted language. Awesome for throwing together small little programs and having them work... somehow. But it's a complete disaster when you start getting into complex systems. Oh yeah, and the implementation of javascript varies by browser and even version of browser. I mean this is the standard... really? I think even the entire paradigm of a stateless web is somewhat dated. Yes you can simulate a stateful experience, but it'd be a million times easier going the other direction and having a stateful system simply not maintain state!
I think even HTML is somewhat dated. Why not simply have sites with a default strongly typed, throw-a-bitch-fit-on-error bytecode language standard? Now suddenly any language that can compile into the bytecode can be deployed to the web. Bytecode with a clear standard would also be more likely to result in more consistent browser implementations across technologies. As it's not just javascript that's inconsistent - even the HTML implementations between browsers (and browser versions) have... inconsistencies. This would also be likely to result in more performant and even more secure webpages. How many billion javascript injection and similar type exploits are there? This would also be nice from an enduser perspective as you might not end up with 95% of sites looking like slightly different variations on the same 3 cookie cutter layouts. Consistent interfaces are nice but there's a line between that and having everything look damned near identical.
Comment on: We need more programming challenges. We should start off small: First non-repeating character of a string. Any language you like.
C# without LINQ
void Foo(string s)
{
foreach( var c in s)
if(s.IndexOf(c) == s.LastIndexOf(c))
{
Console.WriteLine(c);
return;
}
}
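And with LINQ it's nearly a one-liner (GroupBy keeps groups in the order their keys first appear, so the first single-member group is the answer):
char? FirstNonRepeating(string s)
{
    return s.GroupBy(c => c)
            .Where(g => g.Count() == 1)
            .Select(g => (char?)g.Key)
            .FirstOrDefault();
}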
Comment on: Question about Intellectual Property
It's going to be stated in the contract. In the US the developer of code has an inherent copyright on that code. So companies will 'always' require you to waive this right. Some tend to get overzealous though and do things like claim ownership of all code you write while an employee for them. You could write some code while on a month long vacation in the Bahamas, and they could still claim ownership of it. Check with your company. If you think the software stands a very good chance of being able to be commercialized then check with a lawyer instead.
Comment on: Finding Entry Level Positions
A good place to start might be with your college's alumni resources department. The computer science department at your university might also have job placement assistance available. Go check their office. Another good resource is your friends and acquaintances from college. Especially in today's job market, who you know is in many ways more important than what you know. Aside from all of that, set up a site and host some projects demonstrating your skills.
Comment on: What is the best language for someone who wants to learn to code for the first time?
A lot depends on your goals. Would you like to be able to write large scale systems and complex components? Just academic curiosity? Do you just want to be able to write a little game that runs on a webpage?
If you're interested in being able to go all the way to the top, I'd recommend starting with assembly. The big benefit for assembly is that you naturally start to understand why things like encapsulation, objects, and so on exist as opposed to just learning the OO bible and taking it as sacred fact. It also gives you a far better understanding of what's going on under the hood which can help you write more performant and well designed code when using higher level languages. MIPS has a simple assembly syntax and instruction set that's often used for educational purposes. So you'll have countless tutorials, projects, and so on available for learning. Just google for 'MIPS simulator' to find something to run it on.
If you'd rather just get something that's easier to chew on I'd go for C. After that it'd be time for C# - it's cross platform and has done a great job of taking most all desirable features from most languages, cleaning them up, and still keeping them in a very attractive syntax. But it's important to have an intuitive understanding of what's happening under the hood in order to write the most 'effective' code, which is why I think it's critical to start with lower level languages.
On the same theme, even if you just want to throw together some stuff I'd have trouble recommending javascript. It's not really like any other language and may lead to some incredibly bad habits. Its performance is abysmal and as projects grow in complexity it quickly becomes intractable. Coding is a lot like language in that a lot of the real work is done subconsciously. You know when you formulate a sentence you're not sitting there thinking about subject-verb agreement, adjective placement, tense agreement, and so on. That all happens automatically and subconsciously. The same is true for a coder working on a large project. So I think getting your fundamentals down 'correctly' is critical to make sure you don't hamstring yourself.
Comment on: Help with C# questions?
Here's a fairly straightforward but likely challenging (and hopefully fun!) beginner level problem.
-
Create a base type "Monster" class with a few public properties - health, defense, and offense. There will also be a virtual method called Rest that lets the monster heal a bit. Create a few types of monsters that inherit from this class, all with different default stats defined in the constructor, and implement Rest differently. Perhaps some might even gain some offense or defense when they rest! (A rough sketch of this base class is shown after the list below.)
-
Create an Arena class. This class will have a method: public Monster Simulate(int numberOfMonsters) which will create an arena with "numberOfMonsters" random monsters. The arena class will simulate rounds by having two monsters fight each other until one is decided the victor. So if there are 16 monsters, you'll have 8 fights simulated in the first round. 4 in the second, and so on. After each round the winners keep fighting but get one call to Rest(). It returns the winning Monster. What happens during a round with an odd number of Monsters?
-
Create some interesting output by narrating what happens during each fight including outputting the type of Monsters fighting.
This shouldn't be too hard but would sufficiently test a basic understanding of classes, inheritance, and the reasons for (and mechanics of) using multiple files.
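As promised, a minimal sketch of the base class to start from (the stats and the Troll subclass are just illustrative):
public class Monster
{
    public int Health { get; protected set; }
    public int Defense { get; protected set; }
    public int Offense { get; protected set; }

    // Each monster type can override this to heal (or buff itself) differently.
    public virtual void Rest()
    {
        Health += 5;
    }
}

public class Troll : Monster
{
    public Troll()
    {
        Health = 50;
        Defense = 10;
        Offense = 5;
    }

    public override void Rest()
    {
        Health += 10;   // trolls recover quickly
        Defense += 1;    // and toughen up between rounds
    }
}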
Comment on: (Brian Will) Object-Oriented Programming is Garbage: 3800 SLOC example
In your first bullet point, I think object interaction is solved nicely with a registry system. In cases where the objects are logically interweaved but also distinct I'd go for a manager system. As for the second bullet point, since the functionality in question is clearly at least semi-general usage, why not remove it from the lower class and place it in your general utils library for the project?
For the string problem, I'm probably missing something, but if you want a string type specialization why not simply inherit/extend the class? Alternatively, in C# there are extension methods, where you can extend the functionality of a class without actually creating a new class. It's a neat and clean solution as well. It would require a new interface, but I think when the behavior may be different than what people expect that's actually a good thing.
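A minimal sketch of an extension method (a made-up example, not tied to the video):
public static class StringExtensions
{
    // Adds a Shout() method to every string without defining a new string type.
    public static string Shout(this string text)
    {
        return text.ToUpperInvariant() + "!";
    }
}
// Usage: "hello".Shout() returns "HELLO!"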
Comment on: (Brian Will) Object-Oriented Programming is Garbage: 3800 SLOC example
The comment about "navigating a file that is thousands of lines of code is a nicer problem than navigating lots of small files" seems insane. Given a modern IDE it's akin to arguing a book without a table of contents is better. Finding references, going to method definition, etc are all literally a matter of one key press and you get the benefit of a file hierarchy that can be made more granular and clear. It reminds me of people that argue against incrediblyLongVariableNamesAreBad because it's more typing, yet with a modern IDE it's comparable (and often less) typing than ilvnab with millions of times more clarity. He's also not really considering maintainability/extension and code reuse.
Comment on: Does the Go programming language have any future?
I think so many languages come and... go... because there's no compelling need for them. Just do the "me" test. You're more or less just like any other developer out there. So go look over a synopsis of the language. If after doing that it's compelling to you then it's probably compelling to enough other people that it will have a future. If, on the other hand, your main compelling interest in it is because of what somebody else said then there's no reason to think it'll be anything but another short (or not so short) lived fad.
Comment on: Another bigot joining Github. Inclusiveness doesn't include white men.
A white male that undergoes surgery to remove their genitalia and ingests artificial hormones in an attempt to physically resemble a woman who then goes on a nonstop tirade ranting against whites and males. No, no self loathing or mental disorder here at all - just gender dysphoria.
I also don't understand why companies hire folks like this. In spite of what you'd believe from reading Twitter and social media in general, there are completely normal and rational trans folks. This person by contrast literally thinks there is a secret world power organizing against her. This is not a stable person. In any case if you're truly after diversity would you not prefer individuals who are not actively sexist and racist? People like this are going to create one nasty work environment.
Comment on: What constitutes 'coding'?
There are some complications in his answer though. For instance C# can be used for scripting or for compiled coding, which is a really cool feature. You can compile standalone executables (which are actually in an intermediate language format that is later translated to machine code at run time), but you can also write code that takes in arbitrary C# code and interprets it at runtime! See for instance this C# interactive shell.
This question would have been a whole heck of a lot easier to answer in pre-java/just-in-time-compilation times!
Comment on: Programmer quits work on project after getting triggered by a variable name (The comments, however . . .)
I love how invariably the qualifications for the drama queens are:
"[In 4 years as a developer] I've developed or helped develop 21 packages downloaded over 33,000 times [and then lots of stuff that has absolutely nothing to do with programming]"
So they contributed to 21 packages that have been downloaded an average of 1,571 times each. In 4 years. And I have 0 doubt a non-zero number of those 'contributions' were similar to the amazing value offered by changing a variable's name because it offends you. I'm not sure how the community will get on without such an amazingly productive member.
Comment on: How Microsoft Lost the API War - Joel on Software
Cripes. This entire article makes him seem like a soothsayer. Throughout the article, which was written 12 years ago, there are numerous references to goals, strategies, and issues for Microsoft. And nearly all are more meaningful than ever. I enjoyed this bit:
It's so important for Microsoft that the only reason they don't outright give away development tools for Windows is because they don't want to inadvertently cut off the oxygen to competitive development tools vendors (well, those that are left) because having a variety of development tools available for their platform makes it that much more attractive to developers. But they really want to give away the development tools.
Even after some research I was unsure why Microsoft ended up giving away their dev tools, starting with a trickle and arriving at the current state where they give away Visual Studio "Community" (which is effectively Visual Studio Pro) and have even open sourced their C# platform code. I guess I didn't bother checking out 12 year old articles!
VS Designer crashing. The Designer doesn't support breakpoints, so I get to restart the entire friggin IDE, reload the project, and repeat one line at a time to see what's causing VS to crash. Thanks MS.
Comment on: What to do with a weak supercomputer?
Raytracing is awesome for super computers. Now the cool thing about raytracing is not only does it work for images, but a project I previously worked on was raytracing audio. Give each surface resonant properties and you can organically create incredibly realistic audio effects like echoes or various other aural distortions.
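Not the actual project code, but the core idea is something like this (made-up names, toy physics):
class AudioRayToy
{
    const double SpeedOfSound = 343.0; // m/s, roughly, at room temperature
    // One reflected path: the echo's delay comes from the path length, and its
    // gain shrinks by each surface's absorption coefficient (0..1) along the way.
    public static void EchoForPath(double pathMeters, double[] surfaceAbsorption,
                                   out double delaySeconds, out double gain)
    {
        delaySeconds = pathMeters / SpeedOfSound;
        gain = 1.0;
        foreach (double a in surfaceAbsorption)
            gain *= 1.0 - a;
    }
}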
Comment on: What kinds of problems is /v/programming solving these days?
It really doesn't matter what sort of problems you want to solve. Once you learn the fundamentals of programming it's pretty easy to go from one language to another. My language of choice is currently C#, which I was able to jump in and start using immediately even though I'd never read anything extensive about it or used it before. There used to be a pretty big gap between imperative languages like the "C" family and functional languages like Haskell or Lisp, but nowadays the biggest languages, like C#, have started to take some of the handy concepts from functional languages and bring them on over.
Anyhow if you ever hope to become particularly competent at programming it's going to turn into a love and so the exact language you start with doesn't really matter - you'll devour everything. This question is kind of like somebody who's never played baseball before asking if they should start out by learning a curveball, a fastball, or whatever and somebody responding "Well what sort of batter are you pitching to?" Just get to playing and see if you enjoy it before getting into the esoterica.
Comment on: How to learn basics of sql in a simple way?
To what end? If you just need to use a database in your project then I can have you going in a paragraph or two:
-
Transactions are your friend. A transaction is a chunk of statements that are executed as a single atomic unit. If one statement fails (for instance the database becomes locked or inaccessible) the whole transaction is generally rolled back to keep the database consistent. It's not only for safety but also performance: without a transaction each operation is committed on its own (open the database, access/write what you need, close the database), while batching statements into one transaction can increase performance by orders of magnitude. There's a minimal sketch of this after the list.
-
Databases should be indexed. Indexes allow data to be searched in ideally logarithmic time. Create indices on the columns you plan to search by routinely. Columns with low cardinality (lots of repeated values) make poor indexes for grabbing a distinct row. You generally don't want to go index crazy and index everything, as each index takes space and can start to hurt write performance.
-
Databases do their own caching, but since it's general purpose its performance is less than amazing. And if you're storing anything complex in your database then there will be a significant penalty for serializing/deserializing your data (getting your objects into and out of whatever format you store them in when writing/reading from the database) anyhow. In other words, avoid hitting the database as much as reasonably possible. It can very quickly become a bottleneck.
-
SQL is, at least in my opinion, an incredibly ugly and awkward language. It's one of those languages that kind of sort of wanted to try to be understandable/usable by non programmers, but reality made that impossible and so you end up with a syntax that just feels bloated and less than concise for the sake of natural language readability that's not really readable. It's gone through nearly 5 decades of revisions, and it shows in the interface. Once you get the basics down - toss a wrapper around the functionality in the language of your choice, and never touch SQL again.
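To illustrate the transaction and index points above (standard ADO.NET; the connection string and table are made up for illustration):
using System.Data.SqlClient;
class DbDemo
{
    static void InsertMany(string[] names)
    {
        using (var conn = new SqlConnection("<your connection string>"))
        {
            conn.Open();
            // One transaction around the whole batch instead of a commit per row.
            using (var tx = conn.BeginTransaction())
            {
                foreach (var name in names)
                {
                    using (var cmd = conn.CreateCommand())
                    {
                        cmd.Transaction = tx;
                        cmd.CommandText = "INSERT INTO People (Name) VALUES (@name)";
                        cmd.Parameters.AddWithValue("@name", name);
                        cmd.ExecuteNonQuery();
                    }
                }
                tx.Commit(); // if this is never reached, disposing the transaction rolls everything back
            }
        }
        // And index the column you routinely search by (run once, not per insert):
        //   CREATE INDEX IX_People_Name ON People (Name);
    }
}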
Comment on: Github disables repository for using the word "retard."
I also posted about this here.
In particular I'd strongly encourage anybody who thinks this is unreasonable to shoot them a comment. Github can be contacted at https://github.com/contact
Ultimately I think political correctness has no place in development. If you don't like a project using the word retard then don't support the project. If you want to make a "safe space" version of a project then you can fork it and do exactly that. We already have enough to worry about ensuring license compatibility, that we haven't accidentally violated some obscure software patent, and so on. And now we get to add worrying about ending up saying some word or phrase that isn't allowed by the latest incarnation of political correctness to the list of hurdles we have to navigate. In the end it does little more than retard progress.
Comment on: Computer Programming to be renamed Googling Stackoverflow
I mostly just love the defensive "real programmer" comments on the comments page of that article, though a few are even showing up here as well. Languages are little more than tools. Memorizing the nuance and breadth of a language makes one a good developer no more than memorizing a dictionary makes one a good writer.
Comment on: Visual Studio 2015 and .NET 4.6 Available for Download
I was reluctant to check out 2015 without any real pressing need, but I'm glad I did.
So far there's already some nice little QoL type improvements. For instance unnecessary using/references are automatically flagged and you can remove all of them from your entire project with a single click. Pretty sleek. Something very cool is what they've done with source control. It's all natively built in and sleek. In VS2013 you were forced to use things like gitextensions for clean source control support. In 2015 it's all inline and for instance you can immediately check out a seamless diff, check out a version history, etc all with a click or two. Very sleek again. And there's some pretty useful little C# language improvements:
int Blah { get; set; } = 3;
Something that everybody's typed at some point and was stupefied when it didn't work is finally valid code! Another cool feature is nameof:
int foo = 123; string s = nameof(foo);
s is, unsurprisingly, now equal to "foo". There's a lot of really interesting possibilities there. At the very minimum a basic debug output that lists the value of parameters can now survive and make sense even through radical refactoring in the code. Again, sleek.
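For example (just a made-up method to show the refactoring point):
using System;
static class NameofDemo
{
    // If retryCount gets renamed in a refactor, nameof follows it automatically,
    // so this debug line never goes stale.
    public static void Retry(int retryCount) =>
        Console.WriteLine(nameof(retryCount) + " = " + retryCount);
}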
This is from a couple of hours with the code and probably just scratching the surface. I guess summing it up in one word so far would be... sleek.
Comment on: Come up with the most inefficient, poorly written, and complex way to print out "Hello World!".
Yip. It's also kind of shocking how powerful it is even given how incredibly intentionally handicapped the version I wrote is. You're basically playing a game where a person tells you to come up with a statement, case sensitive, that you have no idea about and are given nothing but a number representing in some way (which is not explained) how close you are to being correct. A program being able to solve that game would probably have been considered intelligence at some point in the not so distant past, but like all AI achievements we keep shifting the goal posts just a little bit further each time they're passed.
Comment on: Come up with the most inefficient, poorly written, and complex way to print out "Hello World!".
Sample output where it just couldn't get it:
Hello World!
Generation 0 most eligible bachelor with a suitability of 302 is: 60cse9EbTY
Generation 1000 most eligible bachelor with a suitability of 10 is: Hbmln Xnslb!
Generation 2000 most eligible bachelor with a suitability of 7 is: Hcllo Wnnld!
Generation 3000 most eligible bachelor with a suitability of 11 is: Iejlo Wnskh
Generation 4000 most eligible bachelor with a suitability of 6 is: Felpo World!
Generation 5000 most eligible bachelor with a suitability of 8 is: Gello Voplh!
Generation 6000 most eligible bachelor with a suitability of 6 is: Hfllp Woqmd#
Generation 7000 most eligible bachelor with a suitability of 12 is: Fejlm Unrod!
Generation 8000 most eligible bachelor with a suitability of 11 is: Fcnmo Wnrnd
Generation 9000 most eligible bachelor with a suitability of 5 is: Idllo Wqrkd!
Generation 10000 most eligible bachelor with a suitability of 12 is: Efjlo Xproc!
Generation 11000 most eligible bachelor with a suitability of 11 is: Hejlo Xqslh
Generation 12000 most eligible bachelor with a suitability of 7 is: Helio!Would!
Generation 13000 most eligible bachelor with a suitability of 9 is: Hemjo#Uorld
Generation 14000 most eligible bachelor with a suitability of 4 is: Hemmo Vorle!
Generation 15000 most eligible bachelor with a suitability of 12 is: Ijklo Wprie!
Generation 16000 most eligible bachelor with a suitability of 9 is: Gdklp!Uoqle!
Generation 17000 most eligible bachelor with a suitability of 7 is: Hfono Wosld!
Generation 18000 most eligible bachelor with a suitability of 7 is: Hemlo Wopme#
Generation 19000 most eligible bachelor with a suitability of 5 is: Hemmo Uorle!
Generation 20000 most eligible bachelor with a suitability of 8 is: Ielhp Wprlc!
Generation 21000 most eligible bachelor with a suitability of 12 is: Ldlkp!Woqmf!
Generation 22000 most eligible bachelor with a suitability of 10 is: Gfmlm Xmrlb!
Generation 23000 most eligible bachelor with a suitability of 3 is: Helmo Xormd!
Generation 24000 most eligible bachelor with a suitability of 12 is: Fgmlo!Vntnd!
Generation 25000 most eligible bachelor with a suitability of 10 is: Gdnlo Wosoe
Generation 26000 most eligible bachelor with a suitability of 12 is: Hemnn Wnuie!
Generation 27000 most eligible bachelor with a suitability of 7 is: Hekln#Wnsld!
Generation 28000 most eligible bachelor with a suitability of 8 is: Heplp Worod!
Generation 29000 most eligible bachelor with a suitability of 13 is: Jgkoo Xosmf!
Generation 30000 most eligible bachelor with a suitability of 9 is: Hejlo Wmskb
Generation 31000 most eligible bachelor with a suitability of 9 is: Hfkjp Xpsld
Generation 32000 most eligible bachelor with a suitability of 9 is: Helkm Xlrjd!
Generation 33000 most eligible bachelor with a suitability of 5 is: Hello Vorka!
Generation 34000 most eligible bachelor with a suitability of 11 is: Helko Wqkmd!
Generation 35000 most eligible bachelor with a suitability of 15 is: Hehnm Xptmb!
Generation 36000 most eligible bachelor with a suitability of 8 is: Helko Rorme!
Generation 37000 most eligible bachelor with a suitability of 11 is: Egmko Xmrlc!
Generation 38000 most eligible bachelor with a suitability of 10 is: Hejlp Vmrjb!
Generation 39000 most eligible bachelor with a suitability of 11 is: Hclmp Xmtlf!
Generation 40000 most eligible bachelor with a suitability of 9 is: Femmo!Vnrlf!
Generation 41000 most eligible bachelor with a suitability of 12 is: Heilq Wprjd%
Generation 42000 most eligible bachelor with a suitability of 6 is: Hemlo Vosla!
Generation 43000 most eligible bachelor with a suitability of 10 is: Efmlp Vnqkd!
Generation 44000 most eligible bachelor with a suitability of 9 is: Kdjlo Xoslc!
Generation 45000 most eligible bachelor with a suitability of 9 is: Hgllo$Wrrld!
Generation 46000 most eligible bachelor with a suitability of 12 is: Galno Wptlb!
We have a winner from generation 46417!!
Hello World!
Comment on: Come up with the most inefficient, poorly written, and complex way to print out "Hello World!".
Hello world you say? How about a sort of genetic algorithm to try to intelligently come up with Hello World! knowing and using nothing except a customizable suitability method?
The following code is written for a C# forms window. Simply add a textbox named outputTextBox and a button named goButton with a click event goButton_Click. Alternatively it'd be trivial to port to a console app, but I'm lazy(?). Variables that might be fun to play with are consted and all caps. This program probably has bugs and is undoubtedly poorly written. Enjoy!
using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Data;
using System.Drawing;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using System.Windows.Forms;
namespace HelloWorld
{
public partial class Form1 : Form
{
public Form1()
{
InitializeComponent();
}
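// Suitability: sum of per-character distances from "Hello World!", plus a flat penalty for each character of length mismatch (0 means an exact match).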
int GetClosenessMeasure(string output)
{
const int missValue = 50;
const string target = "Hello World!";
int result = 0;
int length = Math.Min(output.Length, target.Length);
for (int i = 0; i < length; i++)
result += Math.Abs(target[i] - output[i]);
result += Math.Abs(target.Length - output.Length) * missValue;
return result;
}
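// Evolution loop: sort by suitability, cull to the fittest survivors, breed neighboring pairs, and repeat until an exact match (suitability 0) appears.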
void SeekTheGoldenChild()
{
const int MAX_SURVIVORS_PER_GEN = 50;
const int CHILDREN_PER_GEN = 10;
const int OUTPUT_INTERVAL = 1000;
int generations = 0;
bool done = false;
List<Entity> generation = new List<Entity>();
for (int i = 0; i < 100; i++)
generation.Add(new Entity(GetClosenessMeasure));
while (!done)
{
generation.Sort((a, b) => a.MySuitability - b.MySuitability);
if (generation[0].MySuitability == 0)
{
WriteLine("We have a winner from generation " + generations + "!!" + Environment.NewLine + Environment.NewLine + Environment.NewLine + generation[0].MyDNA);
done = true;
}
else
{
generation.RemoveRange(MAX_SURVIVORS_PER_GEN, generation.Count - MAX_SURVIVORS_PER_GEN);
if (generations % OUTPUT_INTERVAL == 0)
WriteLine("Generation " + generations + " most eligible bachelor with a suitability of " + generation[0].MySuitability + " is: " + generation[0].MyDNA);
List<Entity> tempGen = new List<Entity>();
for (int i = 0; i < generation.Count - 1; i++)
for (int j = 0; j < CHILDREN_PER_GEN; j++)
tempGen.Add(generation[i].MakeBaby(generation[i + 1]));
generation = tempGen;
}
++generations;
}
}
void WriteLine(string s)
{
Action a = ( () => outputTextBox.AppendText(s + Environment.NewLine));
if (outputTextBox.InvokeRequired)
outputTextBox.BeginInvoke(a);
else
a();
}
private void goButton_Click(object sender, EventArgs e)
{
Task.Run(() => SeekTheGoldenChild());
}
}
public class Entity
{
const int MAX_LENGTH = 50;
const string LANGUAGE = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789!@#$%^&*() ";
static Random rng = new Random();
public string MyDNA { get; private set; }
public int MySuitability { get; private set; }
private Func<string, int> mySuitabilityMethod;
public Entity(Func<string,int> SuitabilityMethod)
{
int length = rng.Next(MAX_LENGTH);
MyDNA = new string(Enumerable.Repeat(LANGUAGE, length).Select(s => s[rng.Next(s.Length)]).ToArray());
MySuitability = SuitabilityMethod(MyDNA);
mySuitabilityMethod = SuitabilityMethod;
}
private Entity(string dna, Func<string, int> SuitabilityMethod)
{
MyDNA = dna;
MySuitability = SuitabilityMethod(dna);
mySuitabilityMethod = SuitabilityMethod;
}
enum GeneticLotteryWinner {Me, Partner, Milkman};
private GeneticLotteryWinner GetGeneticLotteryWinner(int percentForMe, int percentForPartner, int percentForMilkman)
{
int roll = rng.Next(100);
if (roll < percentForMe)
return GeneticLotteryWinner.Me;
else if (roll < percentForMe + percentForPartner)
return GeneticLotteryWinner.Partner;
else
return GeneticLotteryWinner.Milkman;
}
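// Crossover: the child's length comes from one parent, the other, or randomness, and each character comes from the fitter parent, the weaker parent, or a random mutation, per the odds below.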
public Entity MakeBaby(Entity partner)
{
const int DOM_LENGTH_ODDS = 60;
const int SUB_LENGTH_ODDS = 20;
const int MILK_LENGTH_ODDS = 20;
const int DOM_DNA_ODDS = 60;
const int SUB_DNA_ODDS = 20;
const int MILK_DNA_ODDS = 20;
Entity dom = MySuitability >= partner.MySuitability ? this : partner;
Entity sub = dom == this ? partner : this;
// we'll use a 60 : 20 : 20 ratio for breeding. The dom gets a 60% preference, followed by a 20% for the sub and a 20% for randomness
GeneticLotteryWinner lengthWinner = GetGeneticLotteryWinner(DOM_LENGTH_ODDS, SUB_LENGTH_ODDS, MILK_LENGTH_ODDS);
int length;
switch (lengthWinner)
{
case GeneticLotteryWinner.Me:
length = MyDNA.Length;
break;
case GeneticLotteryWinner.Partner:
length = partner.MyDNA.Length;
break;
default:
length = rng.Next(MAX_LENGTH);
break;
}
char[] dnaTemp = new char[length];
for (int i = 0; i < dnaTemp.Length; i++)
{
int domOdds = DOM_DNA_ODDS, subOdds = SUB_DNA_ODDS, milkOdds = MILK_DNA_ODDS;
if (i >= dom.MyDNA.Length && i >= sub.MyDNA.Length) { domOdds = 0; subOdds = 0; milkOdds = 100; }
else if (i >= dom.MyDNA.Length) { subOdds += domOdds; domOdds = 0; }
else if (i >= sub.MyDNA.Length) { domOdds += subOdds; subOdds = 0; }
GeneticLotteryWinner dnaSpotWinner = GetGeneticLotteryWinner(domOdds, subOdds, milkOdds);
switch (dnaSpotWinner)
{
case GeneticLotteryWinner.Me:
dnaTemp[i] = dom.MyDNA[i];
break;
case GeneticLotteryWinner.Partner:
dnaTemp[i] = sub.MyDNA[i];
break;
default:
dnaTemp[i] = LANGUAGE[rng.Next(LANGUAGE.Length)];
break;
}
}
return new Entity(new string(dnaTemp), mySuitabilityMethod);
}
}
}
Comment on: Serious performance drop (30FPS) moving from VS2013 to VS2015 (both Community)
Awesome! Glad to be able to help.
Comment on: Serious performance drop (30FPS) moving from VS2013 to VS2015 (both Community)
Without patronizing you, I think it's always important to start with the "Have you made sure it's plugged in?" line of stuff first. My first guess here is that you accidentally built the libraries under debug or some sort of non-release settings. It's easy to test this. Take the binaries of the libraries and plug them into your old project and see what happens. If everything's running just dandy there then you can at least isolate the problem to the new base project. And on that topic of course also check for similar optimization/release settings for the new project itself. I'm not sure if these are carried over consistently in the import process.
Comment on: What can help me transition from intermediate web OOP to game OOP
Any game, regardless of complexity, is going to have an identical basic architecture. Well, at least any local game. Get into games where you're running a dependent client rather than a stand-alone application and the architecture becomes different, but even a game like Civilization is going to have an identical architecture at the most fundamental level. Modular design, design patterns, and all of that is an entirely tangential discussion. There's an enormous amount of detail and concept in between, but that's a matter of design and ultimately implementation, not of the broad, layman's-level way to begin thinking about how it works.
Comment on: What can help me transition from intermediate web OOP to game OOP
Haha, interesting. Coming from a game/application background this is such a strange question, but I suppose it makes sense. I imagine the paradigm in web development is an infinite event driven system. Games are much simpler from an architectural (although generally vastly more complex from an implementation) point of view. It's just a state system with constant looping in the game itself.
- state intro = display splash go to menu
- state menu = display menu, poll for input
- state menu.game started = display game intro, go into while(playing) {do some stuff} then go to menu
- state menu.game exited = bye
The game loop itself is going to be made up of input querying, AI updating, physics updating, rendering, and so on. That is where the actual complexity comes in but from a broad architectural point of view games are trivial.
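Sketched out (made-up names, just to show the shape of it):
enum GameState { Intro, Menu, Playing, Exiting }
class Game
{
    GameState state = GameState.Intro;
    public void Run()
    {
        while (state != GameState.Exiting)
        {
            switch (state)
            {
                case GameState.Intro:
                    ShowSplash();
                    state = GameState.Menu;
                    break;
                case GameState.Menu:
                    state = PollMenuInput(); // returns Playing or Exiting
                    break;
                case GameState.Playing:
                    // The actual game loop: input, AI, physics, rendering.
                    while (!GameOver()) { ReadInput(); UpdateWorld(); Render(); }
                    state = GameState.Menu;
                    break;
            }
        }
    }
    // Stubs; a real game fills these in.
    void ShowSplash() { }
    GameState PollMenuInput() => GameState.Exiting;
    bool GameOver() => true;
    void ReadInput() { }
    void UpdateWorld() { }
    void Render() { }
}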
Comment on: The programming talent myth
I love how completely out of touch some of these social justice articles are.
When we see someone who does not look like [a young fit clean cut white male], we assume they are not a real programmer, he said. Almost all of the women he knows in the industry have a story about someone assuming they aren't a programmer. He talked to multiple women attending PyCon 2015 who were asked which guy they are there with—the only reason they would come is because their partner, the man, is the programmer. "If you're a dude, has anyone ever asked you that?"
As a hint, when some guy at a sausage fest event (all with a collective fantasy of meeting a similarly interested and intelligent girl) asks a girl which guy she's with, it's not because of some sort of pseudo-societal stereotyping and implicit dogmatic oppression.
Comment on: The C# "bug" that will catch you.
It's been fixed in foreach loops. It's still present in for loops.
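To illustrate (behavior as of C# 5, if I remember the version right):
using System;
using System.Collections.Generic;
class ClosureDemo
{
    static void Main()
    {
        var actions = new List<Action>();

        foreach (var i in new[] { 0, 1, 2 })
            actions.Add(() => Console.Write(i + " "));
        actions.ForEach(a => a()); // 0 1 2  (foreach now gives each iteration its own variable)

        actions.Clear();
        for (int i = 0; i < 3; i++)
            actions.Add(() => Console.Write(i + " "));
        actions.ForEach(a => a()); // 3 3 3  (all the closures share the single loop variable)
    }
}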
Comment on: The C# "bug" that will catch you.
Think about what you're saying. When you're arguing that something is okay because it's a matter of "education/testing" you're essentially engaging in circular logic along the lines of, "This works this way because it works this way." Many things in C# could have been simply written off as "education/testing" things but they're not.
string s1 = "dog";
string s2 = "dog";
bool hmm = s1 == s2;
A very simple concept, but actually very pertinent. Strings need to be implemented as reference types for obvious reasons. However, the intent of that line of code is to compare references close to 0% of the time. Java stayed true to consistency: == on strings compares references, so two equal strings built at runtime compare unequal (literal interning can mask it in a case like this one). C# took the path of intent, and that statement returns true thanks to an opaquely overloaded == operator.
Languages are tools. They're nothing more than hammers. Of course you do need to spend some time learning aspects of various languages, but the reasons for that learning should be immediately attributable to various benefits. When they're not, it's not a matter of teaching the blacksmith how to work around the tool's faults - it's a matter of improving the tool. The one reason I could see anybody defending this is that any fix would inherently break old code. And I suppose that's a fair point. A mistake that we can learn from when the standard for the next generation of tools is being developed.
Comment on: The C# "bug" that will catch you.
Exactly, and C# is not Javascript. It's not idiomatic. It provides 0 benefit. It's just a gotcha that should have been fixed long ago.
Comment on: The C# "bug" that will catch you.
Well you're comparing things that are very different. In javascript (with var, at least) there's no such thing as block scoping, so this behavior is somewhat expected. In C# everything is block scoped. For instance, the following code is actually a compile time error:
{
{
int a = 3;
}
SomeMethod(a);
}
As is the following:
{
for (int i = 0; i < 10; i++)
i++;
SomeMethod(i);
}
The C# "bug" that will catch you.
Comment on: Help me test my school programming assignment
The uBlock ad blocking extension seems to be interfering with usage of the site. Everything loads and I can see it all just fine, but sending didn't work until I disabled uBlock.
ctrl-shift-s