Filip Ter

Filip is an undergraduate at QMUL studying Computer Science.


Scaling of applications is a common problem that many programmers have to deal with. Just because a solution works on a given input doesn't mean it will work, with satisfactory performance, on input of any size. The same algorithm that sorts 100 integers quickly might scale so badly that, given 1,000,000 integers, it becomes so slow as to be practically useless, or at least a serious drawback to the application it is used in. Given this, one has to be aware of the scale at which their algorithms and programs will be used.
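To make this concrete, here is a minimal sketch (in Python, chosen purely for illustration; the point is language-agnostic) that times a quadratic insertion sort against the built-in sort at two input sizes. The exact timings will vary by machine, but the gap between the two grows rapidly as the input gets larger.

```python
import random
import time

def insertion_sort(values):
    """Sort a list in place using insertion sort (quadratic time)."""
    for i in range(1, len(values)):
        key = values[i]
        j = i - 1
        while j >= 0 and values[j] > key:
            values[j + 1] = values[j]
            j -= 1
        values[j + 1] = key
    return values

# Even 5,000 elements already shows the gap; 1,000,000 would be impractical
# for the quadratic sort in pure Python.
for n in (100, 5_000):
    data = [random.randint(0, 1_000_000) for _ in range(n)]

    start = time.perf_counter()
    insertion_sort(list(data))
    quadratic = time.perf_counter() - start

    start = time.perf_counter()
    sorted(data)
    builtin = time.perf_counter() - start

    print(f"n={n:>6}: insertion sort {quadratic:.4f}s, built-in sort {builtin:.4f}s")
```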

Language speed is often a major factor in determining how useful a programming language can be. Some applications rely on languages that can execute a large number of instructions in a short amount of time, and programmers may in some cases put up with languages that have many unpleasant features simply because they execute an algorithm faster. It would be interesting to see how several languages compare in how long they take to execute the same algorithm.

Generating random numbers on a computer is essential to the development of certain kinds of software. Anything from modelling the environment, to a lottery machine, to determining the value of loot in a chest in an RPG, requires random number generation. At first it may seem strange that computers, which are capable of producing massive quantities of digits in a short time, would not be able to produce random numbers. The difficulty is that the computers we use are constructed specifically to follow logical steps deterministically, so generating truly random numbers from such a system is virtually impossible. True randomness cannot be obtained using arithmetic operations, which is exactly what our computers perform. When a sequence of numbers is random, it is not possible to predict what the next digit will be; since a computer uses a fixed set of logical steps to create any new number, it is in principle possible to predict it.

Nonetheless, programmers take advantage of features for creating random numbers all the time: the most widely used programming languages provide libraries that can generate 'random' values. Any role-playing game you might play, for example, needs random values to determine the amount of gold you'll find in certain locations, what items defeated enemies will drop, and so on. So how is it possible that, even though the very nature of computers makes truly random behaviour impossible, many programs manage to simulate randomness?
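As an illustration of what such libraries do under the hood, here is a minimal sketch (again in Python, purely for illustration) of a linear congruential generator, one of the simplest pseudo-random number generators. The multiplier and increment are constants commonly cited in textbook C library examples; the specific values are not the point. What matters is that the entire sequence is determined by the seed.

```python
def lcg(seed, count):
    """Yield `count` pseudo-random integers from the given seed using a
    linear congruential generator: state = (a * state + c) mod m."""
    state = seed
    for _ in range(count):
        state = (1103515245 * state + 12345) % (2 ** 31)
        yield state

print(list(lcg(42, 5)))
print(list(lcg(42, 5)))  # identical output: the sequence is fully deterministic
print(list(lcg(43, 5)))  # a different seed gives a different-looking sequence
```

Running this twice with the same seed prints identical sequences, which is exactly what "deterministic" means here, and why such values are better described as pseudo-random.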