In layman's terms, a stochastic process has the Markov property if "the future depends only on the present, not on the past"; that is, the conditional probability distribution of future states of the process depends only upon the present state and is conditionally independent of the past states (the path of the process) given the present state. A process with the Markov property is usually called a Markov process, and may be described as Markovian.
Mathematically, the Markov property states that, if X(t), t > 0, is a stochastic process, then

Pr[X(t+h) = y | X(s) = x(s) for all s ≤ t] = Pr[X(t+h) = y | X(t) = x(t)], for every h > 0.
Markov processes are typically termed (time-) homogeneous if

Pr[X(t+h) = y | X(t) = x] = Pr[X(h) = y | X(0) = x], for all t, h > 0,
and otherwise are termed (time-) inhomogeneous. Homogeneous Markov processes, usually being simpler than inhomogeneous ones, form the most important class of Markov processes.
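As a concrete illustration of the distinction (a minimal Python sketch added here, with arbitrary made-up transition probabilities, not something from the original text): a finite-state chain is time-homogeneous when one fixed transition matrix governs every step, and inhomogeneous when the transition probabilities are allowed to depend on t.

```
import numpy as np

rng = np.random.default_rng(0)

# Two-state chain with a single fixed transition matrix => time-homogeneous.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

def step_homogeneous(state):
    """One step of a time-homogeneous chain: the same P at every time t."""
    return rng.choice(2, p=P[state])

def step_inhomogeneous(state, t):
    """One step of an inhomogeneous chain: transition probabilities depend on t."""
    p_flip = 0.1 + 0.05 * (t % 5)        # hypothetical time-dependent rule
    P_t = np.array([[1 - p_flip, p_flip],
                    [p_flip, 1 - p_flip]])
    return rng.choice(2, p=P_t[state])

state = 0
for t in range(10):
    state = step_homogeneous(state)
print("homogeneous chain ended in state", state)
```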
In some cases, apparently non-Markovian processes may still have Markovian representations, constructed by expanding the concept of the 'current' and 'future' states to each include the states over an interval of times; for example, the state at time t can be taken to be the whole trajectory {X(s) : s in [t − w, t]} over some window of width w.
An example of a non-Markovian process with a Markovian representation is a moving average time series.
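To make that example concrete, here is a small sketch (again added for illustration, with an arbitrary MA(1) coefficient): the series X_t = e_t + 0.5*e_{t-1} is not Markov in X alone, since predicting X_{t+1} requires the unobserved noise term e_t, but the enlarged state (X_t, e_t) is Markovian.

```
import numpy as np

rng = np.random.default_rng(1)
theta = 0.5          # MA(1) coefficient, chosen arbitrarily for illustration

def ma1_step(prev_noise):
    """Advance the MA(1) process X_t = e_t + theta * e_{t-1} by one step.

    Knowing only the previous value X_{t-1} does not determine the
    distribution of X_t; the previous noise term is also needed.  The pair
    (X_t, e_t) therefore serves as a Markovian state for a process whose
    observable values alone are non-Markovian.
    """
    e_t = rng.normal()
    x_t = e_t + theta * prev_noise
    return x_t, e_t     # the enlarged, Markovian state

state_noise = rng.normal()
series = []
for _ in range(5):
    x, state_noise = ma1_step(state_noise)
    series.append(x)
print(series)
```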
The most famous Markov processes are Markov chains, but many other processes, including Brownian motion, are Markovian.
Friday, September 24, 2004
Wednesday, September 08, 2004
Exponentially Quantum
It's really more confusing than you think.
It explained the superposition of quantum states in classical computer terms of numbers in registers--every qubit added exponentially increases the number of registers the algorithm operates on.
Actually, every qubit only adds a qubit, but the size of the classical registers that would be needed to simulate the quantum register grows exponentially. This is not the same as saying the quantum register is equivalent to the huge classical register that would be needed to simulate it. The huge classical register would actually be much more useful if it could be made.
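A rough Python sketch of that point (added here as an illustration, not part of the original exchange): describing the joint state of n qubits classically takes 2**n complex amplitudes, so it is the memory needed to simulate the quantum register, not the register itself, that grows exponentially.

```
import numpy as np

def classical_state_vector(n_qubits):
    """Full classical description of an n-qubit register:
    one complex amplitude per basis state, i.e. 2**n numbers."""
    state = np.zeros(2 ** n_qubits, dtype=complex)
    state[0] = 1.0          # start in |00...0>
    return state

print(classical_state_vector(3).shape)   # (8,) amplitudes for 3 qubits

for n in (1, 10, 20, 30):
    amplitudes = 2 ** n
    # 16 bytes per complex128 amplitude; only printed, never allocated, for n = 30
    print(f"{n:2d} qubits -> {amplitudes:>13,d} amplitudes "
          f"(~{amplitudes * 16 / 1e9:.3f} GB)")
```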
The calculation runs and ends in a group of results. Only one result of all those calculations will be right.
You are referring to a peculiarity of the Shor factoring algorithm, which is still the only useful truly quantum algorithm known. As I said, ideally a quantum algorithm would give only a single answer, i.e. at the end of the calculation the quantum state of the register would be one particular answer with unit probability and all other answers would have zero probability. However, the Shor algorithm does not do this; there is some probability of a wrong answer, and you have to run the calculation a few times to see what answers you get and then check them to see which one is right. This is not necessarily a universal characteristic of all quantum algorithms, but it's hard to say because so far the Shor algorithm is just about all we have.
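For the flavor of that "run it a few times and check" step, here is a hedged Python sketch of the classical post-processing around Shor's algorithm. The find_period function stands in for the quantum subroutine; here it simply brute-forces the period, which is exactly the step a quantum computer would perform efficiently, and N = 15 is just a toy example.

```
from math import gcd
from random import randrange

def find_period(a, N):
    """Stand-in for the quantum subroutine: the period r of a^x mod N.
    A real quantum computer finds r efficiently; here we brute-force it."""
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_attempt(N):
    """One run: may fail to yield a factor, in which case you try again."""
    a = randrange(2, N)
    d = gcd(a, N)
    if d > 1:                       # lucky: a already shares a factor with N
        return d
    r = find_period(a, N)
    if r % 2 == 1:                  # odd period: this run gives no factor
        return None
    y = pow(a, r // 2, N)
    if y == N - 1:                  # a^(r/2) = -1 mod N: also a failed run
        return None
    return gcd(y - 1, N)

N = 15
factor = None
while factor is None or factor in (1, N):
    factor = shor_attempt(N)        # repeat until a run checks out
print(f"{N} = {factor} x {N // factor}")
```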
Any problem that grows exponentially harder to solve as you add terms, a non-polynomial problem, is tractable to a quantum computer because you simply tack on another qubit for each new term added and then run the calculation in polynomial time.
This was an early hope for QC but it is now known not to be true. Actually, there is only one algorithm known that can take advantage of the power of QC to solve what is believed to be an exponentially hard problem, and other exponentially hard problems are known not to be soluble in linear time by QC.
There is another very important exponentially hard problem which is known to be soluble by QC, although little work has been done on algorithms for it. That is the simulation of quantum systems themselves. In the long run, this may be the most important application of quantum computing, if it ever becomes practical.
--BY MarkGubrud (gubrud@squid.umd.edu)
Tuesday, September 07, 2004
Quantumized Brainy
For those who have been seriously inspired or irritated by Roger Penrose's hypotheses on the possible basis of consciousness in quantum effects occurring inside neurons in the brain, a trio of researchers has published a speculative proposal suggesting that biological microtubules may act as quantum electrodynamic cavities and have the potential for quantum entanglement, teleportation and computation. The authors suggest that this mechanism may be responsible for how the brain works, or might at least provide biological building blocks for creating quantum computers. A preprint of their research paper is available online on the arXiv preprint server at http://www.arxiv.org/abs/quant-ph/0204021
Wednesday, September 01, 2004
Quantum Computing Gets Five Photons Closer
Quantum computers, which use attributes of quantum particles like atoms and photons to represent data, promise to solve certain very large problems many orders of magnitude faster than is possible using today's computers. The challenge is being able to manipulate particles well enough to carry out computing.
A key step is being able to entangle five particles, which would make it possible to check computations for errors and teleport quantum information within and between computers.
Researchers from the University of Science and Technology of China, the University of Innsbruck in Austria, and the University of Heidelberg in Germany have entangled five photons.
Error correction uses mathematical codes to detect when a bit has been accidentally flipped, and is widely used in classical computing because electronic and magnetic bits occasionally switch accidentally from a 1 to a 0 or vice versa. Quantum bits are more delicate still, and a workable error correction method is required for quantum computing to be feasible.
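As a point of comparison, here is a minimal Python sketch of the classical idea only (quantum error correction is more involved, since qubits cannot simply be copied and read out): the simplest classical code repeats each bit three times and takes a majority vote, with an arbitrary 10% flip probability chosen for the demonstration.

```
import random

def encode(bit):
    """Three-bit repetition code: store each logical bit three times."""
    return [bit, bit, bit]

def noisy_channel(codeword, flip_prob=0.1):
    """Flip each physical bit independently with probability flip_prob."""
    return [b ^ (random.random() < flip_prob) for b in codeword]

def decode(codeword):
    """Majority vote: corrects any single flipped bit."""
    return int(sum(codeword) >= 2)

random.seed(0)
errors = 0
for _ in range(10_000):
    sent = random.randint(0, 1)
    received = decode(noisy_channel(encode(sent)))
    errors += (received != sent)
print("logical error rate:", errors / 10_000)
```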
The researchers used the five-photon entanglement process to carry out open-destination teleportation, which makes it possible to transmit information to any one of several processors within a quantum computer or nodes in a quantum network. Quantum teleportation is akin to faxing a document and in the process destroying the original.
It will be more than a decade before the technology is practical, according to the researchers. The work appeared in the July 1, 2004 issue of Nature.
Technology Research News (MIT FEED)