The world wants to boycott Germany? Or at least says so? Well, apparently they want to boycott buying German products, not receiving German money.
It is easy to blame someone, especially someone with a history as bad as that of us Germans. What I saw in this report about the poor Greeks' situation (having to give their children away) shocked me. And while there might be nationalism in Germany, most Germans would not agree to measures that lead to something like that. They just don't know that it really is that bad.
Don't forget: Many Germans are poor too - mostly not that poor, apparently, but still poor. And their only daily reading is Bild - which tells them that the lazy Greek people take away their money. The reason that poor Greek people are angry at Germany is similar to the reason that poor German people are angry at Greece: They think that the other country causes their bad situation.
So just today I read that Apple could have bailed out Greece twice. And I wonder: Why not #boycottApple then? But again, it is not Apple alone. I am sure there are many companies that could do so. And I am sure that there are many companies that even caused Greece's situation. Why don't we all just boycott them? Why not #boycottCapitalism?
Could it be because we all secretly love the current kind of capitalism? Greece benefited from it in the past, just as Germany did. As Volker Pispers points out: It is the modern religion; you don't have to understand it, you just have to believe in it.
While I can pity individual persons, I do not pity the Greek people as a whole. For decades they accepted this system. And when it was too late, they elected extremist parties. I am sure that if it were the other way around, they would have a government that would behave like Germany does now, maybe a little better, maybe a little worse.
And I am sure the day will come when many of the German people who now cry about our precious German money being given away to greedy Greeks will be in need of help themselves. Hopefully the system will have changed by then.
Every nation gets the government it deserves. (Joseph de Maistre)
I am not a professional software developer - well, I guess in some sense I am, but at least not an industrial professional software developer. But from what I have heard from industrial software developers, what I will describe now is a common experience:
You develop a complex piece of software. Especially the "practical" pieces of software have usually grown evolutionarily and are very complex. And usually, when writing code, you write something very specific and nontrivial - because if it were trivial, there would usually be no reason to write it, except as an exercise, which is also pointless if the solution is trivial to you.
So what happens to me often (and as far as I have heard, I am not alone in this) is that I write a large piece of software, and then compile and test it - just to eliminate the worst bugs and lots of small mistakes. I really hate this phase of writing code.
So recently, I had to optimize a functional algorithm. I did so, and it was really hard; it took me a few hours. Then I leaned back and had this feeling that I always have before the first debugging phase.
But just a few moments later I remembered: The code I had written was functional for a reason: The entire algorithm was formally verified against a specification which I had already debugged and reviewed thoroughly. Admittedly, it took me longer to write verified code than it would have taken to write unverified code, but the whole phase of testing and debugging was practically nonexistent.
There might be some duplicates from past link lists due to a broken computer and Firefox profile fsckup. Sorry about that.
Induction is one of the nightmares of many computer science students, it seems. For mathematicians, it is a very natural concept, and often the first more advanced proof strategy they learn. Induction itself is not hard; the problem is that most proofs involve a lot more work than just induction, and it can be hard to even find the right parameters for induction.
When doing type theory, one quickly notices that induction is somehow equivalent to recursion: Recursion operators have a type signature similar to the formulas which encode induction principles. This is not a coincidence - but recursion operators are not really intuitive for a beginner either. However, I think that for many beginners who have at least a little knowledge about programming, the key to understanding induction could be recursion. Even though from a purely "practical" viewpoint, recursion might not be good programming practice (especially if you ask older programmers), it is a well-understood concept.
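To make this similarity concrete, here is one common way to write both down (the notation is mine and only meant as an illustration). The induction principle for the natural numbers can be stated as $P(0) \Rightarrow \big(\forall n.\; P(n) \Rightarrow P(n+1)\big) \Rightarrow \forall n.\; P(n)$, while the (primitive) recursion operator for the natural numbers has the type $\mathrm{rec} : C \to (\mathbb{N} \to C \to C) \to \mathbb{N} \to C$. Reading $C$ as a fixed result type and $P(n)$ as the type of proofs of $P(n)$, both have the same shape: a base case, a step that turns the result for $n$ into the result for $n+1$, and an output for every natural number.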
I assume that the reader knows what a function - in the sense of programming - is. I will use C notation here, but without pointer magic, to make it easily understandable.
A well known function is the factorial of a number. Mathematical notation for the factorial of $n$ is usually $n!$. For $n > 0$, we can "define" the factorial as $n! = 1 \cdot 2 \cdot \ldots \cdot n$. Furthermore, it is defined that $0! = 1$. A simple implementation of this function in C would look like
int fact(int n) {
  int ret = 1;
  for (int i = 1; i <= n; i++) {
    ret = ret * i;
  }
  return ret;
}
And in fact, this would calculate the factorial (and is probably the way most software engineers would recommend). On the other hand, notice that $n! = n \cdot (n-1)!$ for $n > 0$. This equation and $0! = 1$ fully determine the values of the factorial function. So, an alternative way to write a factorial function in C is
int fact (int n) {
  if (n == 0) return 1;
  else return n * fact(n - 1);
}
This function calls itself. This is called recursion; the function is called recursive. Notice that the function argument decreases. Behind this lies a simple principle: We can already calculate the function for a smaller argument, and we can use this value to calculate the function's value for the larger argument. We can reduce the problem to a simpler problem.
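For example, unfolding the recursion for $n = 3$ gives $\mathrm{fact}(3) = 3 \cdot \mathrm{fact}(2) = 3 \cdot 2 \cdot \mathrm{fact}(1) = 3 \cdot 2 \cdot 1 \cdot \mathrm{fact}(0) = 3 \cdot 2 \cdot 1 \cdot 1 = 6$.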
Another classical example for recursion (and especially for the problems with recursion) is the Fibonacci sequence. We define $f_1 = f_2 = 1$ and $f_n = f_{n-1} + f_{n-2}$ for $n > 2$. This is a recursive definition. In C, we can write
int fib (int n) {
  if (n < 3) {
    return 1;
  } else {
    return fib(n-1) + fib(n-2);
  }
}
This definition has a major disadvantage: Its runtime is exponential (if you don't know what that means: really, really slow), because the calls fib(n-1) and fib(n-2) recompute the same smaller values over and over again. But it also has a great advantage: It is easy to write and easy to understand. The non-recursive definition looks like
int fib (int n) {
  int j = 1, k = 1;
  for (int i = 3; i <= n; i++) {
    int l = j + k;
    j = k;
    k = l;
  }
  return k;
}
This function is (arguably) harder to understand. This is often the case, and this is a common problem that computer science tries to solve: Things which are easy for humans are hard for computers and vice versa. That is why modern programming languages have abstractions like functions, classes, modules, etc. The art of computer science is to find a common language that both computers and humans understand well. For the hardest parts, people rely on mathematical proofs.
And one principle that makes it easier for humans to reason is to reduce a problem to simpler problems. And this is what recursion can do.
Now back to induction. The classical introduction to induction usually contains Gaussian sums: Assume we want to add the numbers from $1$ to $n$, that is, we want to calculate $1 + 2 + \ldots + n = \sum_{i=1}^{n} i$. This is similar to the factorial, except that we use addition instead of multiplication. And of course, we can use the same function as above, and just replace multiplications with additions (and 1 with 0):
int gauss (int n) {
  if (n == 0) return 0;
  else return n + gauss(n - 1);
}
This function loops through all numbers up to $n$; it takes linear time. This is not bad, but there is an easier way:
int gauss (int n) {
  return (n * (n + 1))/2;
}
In fact, $\sum_{i=1}^{n} i = \frac{n \cdot (n+1)}{2}$ for all $n \in \mathbb{N}$. How do we know that this holds for every natural number? We can calculate this for every single number, but not for all numbers, as there are infinitely many natural numbers: We need a mathematical proof.
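To make this point a bit more tangible, here is a small test program (a sketch; the names gauss_rec and gauss_closed are only introduced here so that both versions can coexist). It can confirm the equation for as many single values as we like, but no finite loop ever covers all natural numbers:

#include <assert.h>
#include <stdio.h>

/* the recursive sum from above, renamed for this sketch */
int gauss_rec (int n) {
  if (n == 0) return 0;
  else return n + gauss_rec(n - 1);
}

/* the closed formula from above, renamed for this sketch */
int gauss_closed (int n) {
  return (n * (n + 1))/2;
}

int main (void) {
  /* check the equation for the single values 0..1000 -
     this is evidence, not a proof for all natural numbers */
  for (int n = 0; n <= 1000; n++) {
    assert(gauss_rec(n) == gauss_closed(n));
  }
  printf("equation holds for all checked values\n");
  return 0;
}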
Before we go into induction, let us define a function from natural numbers to proofs. It is possible to do this formally, but not without a deeper understanding of mathematical logic - so we just do this informally here. Let us call this function $P$. That is, $P(0)$ is a proof for $\sum_{i=1}^{0} i = \frac{0 \cdot (0+1)}{2}$, $P(1)$ is a proof for $\sum_{i=1}^{1} i = \frac{1 \cdot (1+1)}{2}$, and in general, $P(n)$ is a proof for $\sum_{i=1}^{n} i = \frac{n \cdot (n+1)}{2}$. We will define this function recursively.
For $n = 0$, we can just calculate this. So, we can say $P(0) = \left( \sum_{i=1}^{0} i = 0 = \frac{0 \cdot (0+1)}{2} \right)$.
For $n$ with $n > 0$, we can do recursion. $P(n-1)$ proves that $\sum_{i=1}^{n-1} i = \frac{(n-1) \cdot n}{2}$. Using this proof, we can set
$P(n) = \left( \sum_{i=1}^{n} i = n + \sum_{i=1}^{n-1} i \overset{P(n-1)}{=} n + \frac{(n-1) \cdot n}{2} = \frac{2n + n^2 - n}{2} = \frac{n \cdot (n+1)}{2} \right)$.
We recursively defined a function that generates proofs. We did this in a very informal way, but with just a little more theory, we could do this in a fully formal way. (If you're interested, look at proof assistants like Coq or Agda.)
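To give a rough impression of what this looks like in such a system, here is a sketch in the Lean proof assistant (a relative of Coq and Agda). The names sumTo and gauss_formula are made up for this sketch, and the statement is phrased as $2 \cdot \sum_{i=1}^{n} i = n \cdot (n+1)$ to avoid division on natural numbers:

import Mathlib.Tactic.Ring

-- the recursive sum, mirroring the recursive C function gauss above
def sumTo : Nat → Nat
  | 0     => 0
  | n + 1 => (n + 1) + sumTo n

-- the Gaussian sum formula, proved by induction; the proof term plays
-- the role of the informal proof-valued function P defined above
theorem gauss_formula (n : Nat) : 2 * sumTo n = n * (n + 1) := by
  induction n with
  | zero => rfl                         -- the case P(0): both sides are 0
  | succ k ih =>                        -- ih is the proof for k, i.e. P(k)
    simp only [sumTo, Nat.mul_add, ih]  -- unfold one step and use P(k)
    ring                                -- the remaining arithmetic identity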
Denote by $A(n)$ our proposition $\sum_{i=1}^{n} i = \frac{n \cdot (n+1)}{2}$. Notice that $P(n)$ is a proof of $A(n)$.
Notice that our definition of $P(0)$ proves $A(0)$. Our definition of $P(n)$ for the other cases uses $P(n-1)$ - a proof of $A(n-1)$ - to prove $A(n)$. So essentially, one could say that the recursive definition is itself a proof of $\forall n.\; A(n)$. And notice that these are the premises of induction: If $A(0)$ holds, and for all $n$ we have $A(n) \Rightarrow A(n+1)$, then we have $A(n)$ for all $n$.
There are more general induction schemes. For example, on binary trees, you can do induction on every branch - but in the same way, you can do recursion. The important part about induction (and recursion) is that something decreases, and that it cannot decrease infinitely often.
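As an illustration (the struct and the function name here are made up for this sketch), counting the nodes of a binary tree recurses into both branches, and the corresponding induction principle has one induction hypothesis per branch:

#include <stddef.h>

/* a minimal binary tree; real code would also carry some payload */
struct tree {
  struct tree *left;
  struct tree *right;
};

/* counts the nodes by recursing into both branches; each recursive
   call corresponds to one induction hypothesis in a proof about trees */
int count_nodes (struct tree *t) {
  if (t == NULL) return 0;
  else return 1 + count_nodes(t->left) + count_nodes(t->right);
}

The decreasing quantity is the size of the tree: both branches are strictly smaller than the whole tree, so neither the recursion nor the induction can go on forever.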