If You’re Curious …
If you want to know more about Turing machines and the basics of computation, you might like The Annotated Turing by Charles Petzold. It's structured as an annotated version of Turing's original paper, but most of the contents are Petzold's explanations of the main concepts, with lots of examples. It's a great introduction to the topic. For a more fundamental textbook on computation, you could take a look at Elements of the Theory of Computation by Lewis and Papadimitriou. For an easy-to-read, wide-ranging popular introduction to the basic concepts of algorithmics, I recommend Algorithmic Adventures: From Knowledge to Magic by Juraj Hromkovič. For more specifics on asymptotic analysis, a solid textbook, such as one of those discussed in Chapter 1, would probably be a good idea. The book by Cormen et al. is considered a good reference work for this sort of thing. You can certainly also find a lot of good information online, such as in Wikipedia, but you should double-check the information before relying on it for anything important, of course. If you want some historical background, you could read Donald Knuth's paper "Big Omicron and big Omega and big Theta," from 1976.
For some specifics on the perils and practices of algorithmic experiments, there are several good papers, such as "Towards a discipline of experimental algorithmics," "On comparing classifiers," "Don't compare averages," "How not to lie with statistics," "Presenting data from experiments in algorithmics," "Visual presentation of data by means of box plots," and "Using finite experiments to study asymptotic performance" (details in the "References" section). For visualizing data, take a look at Beginning Python Visualization by Shai Vaingast.
There are many textbooks on graph theory; some are rather technical and advanced (such as those by Bang-Jensen and Gutin, Bondy and Murty, or Diestel, for example), and some are quite readable, even for the novice mathematician (such as the one by West). There are even specialized books on, say, types of graphs (Brandstädt et al., 1999) or graph representations (Spinrad, 2003). If this is a topic that interests you, you shouldn't have any trouble finding lots of material, either in books or online. For more on best practices when using floating-point numbers, take a look at Foreman S. Acton's Real Computing Made Real: Preventing Errors in Scientific Engineering Calculations.
Exercises
2-1. When constructing a multidimensional array using Python lists, you need to use for loops (or something equivalent, such as list comprehension). Why would it be problematic to create a 10×10 array with the expression [[0]*10]*10?
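If you'd like to see the problem for yourself before reasoning about it, a short interactive session illustrates it (I'm using a 3×3 array here just to keep the output readable, and the variable names are only for illustration):

>>> row = [0]*3
>>> grid = [row]*3                     # looks like a 3×3 array of zeros
>>> grid[0][0] = 23
>>> grid
[[23, 0, 0], [23, 0, 0], [23, 0, 0]]
>>> safe = [[0]*3 for i in range(3)]   # built with a list comprehension
>>> safe[0][0] = 23
>>> safe
[[23, 0, 0], [0, 0, 0], [0, 0, 0]]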
2-2. Assume (perhaps a bit unrealistically) that allocating a block of memory takes constant time, as long as you leave it uninitialized (that is, it contains whatever arbitrary "junk" was left there the last time it was used). You want an array of n integers, and you want to keep track of whether each entry is uninitialized or whether it contains a number you put there. This is a check you want to be able to do in constant time for any entry. How would you do this with only constant time for initialization? And how could you use this to initialize an empty adjacency array in constant time, thereby avoiding an otherwise obligatory quadratic minimum running time?
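Python won't actually hand you uninitialized memory, but you can simulate the setup to make the problem concrete. The following sketch is mine (the class and method names are not from the text); it fakes a freshly allocated block by filling it with arbitrary junk and uses the obvious linear-time bookkeeping, which is exactly the part the exercise asks you to replace with something that takes constant time:

from random import randrange

class JunkArray:
    """A stand-in for a freshly allocated, uninitialized block of n integers."""
    def __init__(self, n):
        self.data = [randrange(1000) for i in range(n)]  # arbitrary "junk"
        self.known = [False]*n  # naive, linear-time bookkeeping -- make this O(1)

    def set(self, i, value):
        self.data[i] = value
        self.known[i] = True

    def is_set(self, i):        # must run in constant time for any entry
        return self.known[i]

    def get(self, i):
        return self.data[i] if self.known[i] else None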
2-3. Show that O and Ω are inverses of one another; that is, if f is O(g), then g is Ω(f), and vice versa.
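If you need a starting point, here are the two standard definitions with the constants written out (the constants c and n0 need not be the same in the two statements):

\[ f \in O(g) \iff \text{there exist } c > 0 \text{ and } n_0 \text{ such that } f(n) \le c \cdot g(n) \text{ for all } n \ge n_0 \]
\[ f \in \Omega(g) \iff \text{there exist } c > 0 \text{ and } n_0 \text{ such that } f(n) \ge c \cdot g(n) \text{ for all } n \ge n_0 \]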
2-4. Logarithms can have different bases, but algorists don't usually care. To see why, consider the change-of-base equation log_b(n) = log_a(n)/log_a(b). What does this tell you about switching from one base to another inside asymptotic notation?
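As a nudge in the right direction, here is that equation rewritten so that everything depending on n is isolated (a and b are any two fixed bases, so the fraction in front is just a constant):

\[ \log_b n = \frac{\log_a n}{\log_a b} = \frac{1}{\log_a b} \cdot \log_a n \]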