Test Your Intuition (10): How Does “Random Noise” Look

This is a somewhat unusual post in the “test your intuition” corner, as the problem is not entirely formal.

What does random noise in the digital world typically look like?

Suppose you have a memory of n bits, or a memory based on a larger r-letter alphabet, and suppose that “random noise” hits the memory in such a way that the probability of each bit being affected is t.

What will be the typical behavior of such random digital noise? Part of the question is to define “random noise” in the best possible way, and then to answer the question for that definition.

In particular, will the “random noise” typically behave in a way that is close to independent across the different bits? Or will it be highly correlated? Or perhaps the answer depends on the size of the alphabet and the value of t?
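One candidate for the first extreme is fully independent Bernoulli noise: each bit is flipped with probability t, independently of all the others, so the number of affected bits concentrates around tn. Here is a minimal sketch (in Python; the function name and parameters are mine, for illustration only):

```python
import random

def bernoulli_noise(bits, t):
    """Flip each bit independently with probability t."""
    return [b ^ (random.random() < t) for b in bits]

# Example: n = 10,000 bits, per-bit flip probability t = 0.01.
# The number of flipped bits concentrates around t*n = 100.
n, t = 10_000, 0.01
memory = [0] * n
noisy = bernoulli_noise(memory, t)
print(sum(noisy))  # typically close to 100
```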

The source of this question is an easy fact about quantum memory, which asserts that if you consider a random noise operation acting on a quantum memory of n qubits, and if the probability that each qubit is damaged is a tiny real number t, then typically the noise has the following form: with large probability nothing happens, and with tiny probability (a constant times t) a large fraction of the qubits are harmed.
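For contrast, the dichotomy just described can be mimicked classically by a hypothetical “burst” toy model (my own illustration, not taken from the post): with probability t/q a random fraction q of the bits is hit, and otherwise nothing happens. Each bit is still affected with marginal probability t, yet the noise is highly correlated rather than independent.

```python
import random

def burst_noise(bits, t, q=0.5):
    """With probability t/q flip a random fraction q of the bits;
    otherwise leave the memory untouched.  The per-bit marginal
    probability of being affected is (t/q)*q = t, matching the
    independent Bernoulli model above."""
    n = len(bits)
    out = list(bits)
    if random.random() < t / q:
        for i in random.sample(range(n), int(q * n)):
            out[i] ^= 1
    return out

# Same marginals as before (t = 0.01), but the joint behavior is
# very different: usually nothing happens, and occasionally half
# the memory is hit at once.
n, t = 10_000, 0.01
print(sum(burst_noise([0] * n, t)))  # usually 0, rarely about 5000
```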

I made one attempt at the digital (binary) question, but I am not at all sure that it is the “correct” definition of what “random noise” is.

(Maybe I should also try asking the problem on MathOverflow. See also here, here, and here for what MathOverflow is.)

Update: Over on MathOverflow, Greg Kuperberg made an appealing argument that, for the correct notion of random noise, the behavior in the classical case is similar to that of the quantum case.

Test Your Intuition: #9 (answer to #9), #8 (answer), #7, #6, #5, #4 (answer), #3 (answer), #2, #1.


3 Responses to Test Your Intuition (10): How Does “Random Noise” Look

  1. v8r says:

    Can $t$ depend on $n$? Suppose it does not. Then we can define Bernoulli noise: each bit is flipped with probability $t$. So typically $n(1-t)$ bits are not affected.
    More bits are affected with smaller probability.

  2. Gil Kalai says:

    Dear v8r, certainly Bernoulli noise, which acts independently on each bit, is an important example. The question is: does a typical random example of noise behave like Bernoulli noise?

    (For simplicity we can assume that t is a small constant.)

  3. proaonuiq says:

    This “Math Overflow” site is great! The design is close to perfect. I like the votes and views statistics, the tags and user rankings, etc.

    As a sporadic blog commenter I miss this kind of thing in blogs. You have no information about your audience, nor whether your comments are abhorred or loved (although you can imagine).

    I’m surprised by how low digitally flavoured subjects (computer-science-related topics) such as computational complexity are ranked. The hottest topic up to now seems to be algebraic geometry. We must wait and see how the ranking “evolves”.

    Thanks for the pointer!
