a guest
Oct 20th, 2018
# Week 1
Problems are organised by complexity: linear time is O(n), polynomial time is O(n^k), and exponential time is O(c^n) for some constant c > 1.

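To get a feel for how quickly these classes diverge, here is a small sketch; the step counts are illustrative placeholders, not the cost of any particular algorithm:

```python
# Compare how linear, polynomial, and exponential step counts grow with n.
# These functions are illustrative, not real algorithm costs.

def linear(n):
    return n          # O(n)

def polynomial(n, k=3):
    return n ** k     # O(n^k), here with k = 3

def exponential(n, c=2):
    return c ** n     # O(c^n), here with c = 2

for n in (10, 20, 40):
    print(n, linear(n), polynomial(n), exponential(n))
```

Even at n = 40 the exponential count (2^40, over a trillion) dwarfs the cubic one (64,000), which is why moving a problem from exponential to polynomial time matters so much.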
Hundreds of algorithms for quantum computers have been found that would decrease the complexity of certain problems significantly. The speed-up is rarely as dramatic as exponential down to polynomial; more often it is something like O(n^4) down to O(n^2), though sometimes it is greater.

These speed-ups would make quantum computers very useful, as they could solve very hard problems in much less time, so there is strong demand for them.

### Moore's Law
Not really a law, more a trend which Gordon Moore articulated in 1965: the number of transistors on a chip doubles every 2-3 years.

It is really more of an economic 'law': given the current fabrication technology, at what density is the price per transistor minimised?
But it is coming to an end.
Transistors today contain a countable number of atoms; they are made of silicon and are generally 10-15 nanometres across.

A typical bit of a CPU is built from a transistor with a silicon channel and a gate placed over the channel. The gate controls the flow of electrons through the semiconducting silicon channel: at certain gate voltages electrons cannot flow through the channel, while other voltages allow them to flow.

Making a transistor requires the intentional introduction of impurities, called <b>dopants</b>, in a process called doping. Doping influences the electrical properties of the semiconductor, making it easier or harder for electrons to flow, and so makes the material suitable for use as a transistor.

Once the number of atoms in a transistor drops below a certain level, the dopants can no longer influence the electrons properly, and that may be a major contributor to the end of Moore's Law: there is ultimately a physical limit to how small transistors can get.
Trade-offs can be made, such as stacking transistors one atop another to fit more on a chip, but the heat produced would become unmanageable.

Instead, scientists are working on numerous 'post-silicon' technologies, including photonics and quantum machines.

Furthermore, development of quantum computers also directly contributes to concepts such as reversible computing.

<div class='blockText'>
<span class='highlight'>Reversible Computing</span><br>
Reversible computing is the concept that you can recover the inputs of a computation from its outputs, possibly by adding some extra components to the output.
An irreversible operation is one such as addition: you cannot infer what the inputs were from the output alone.
However, outputting both the sum and the difference of the two inputs makes it possible to uniquely identify what the inputs were, and that is a reversible operation.

Reversible operations are not bound by Landauer's Principle, which states that there is an absolute minimum amount of energy required for an irreversible computation (erasing information costs energy). Since reversible operations erase no information, this minimum does not apply to them, so they are useful: they can dip below that energy floor with no obstruction, leading to ever more efficient machines.

However, such machines would have to be reversible right up to the algorithmic level, so they would be incredibly hard to implement if they can exist at all, and they would likely have limited use cases, as they would probably not be general-purpose machines; alternatively, we could make new reversible programming languages and design everything around reversible concepts.

This means that writing reversible programs would require a lot of human insight, rather than some sort of automatic conversion program, so developing all of these programs would be hard. But it would not be impossible, provided the machine itself isn't impossible: reversible programming languages do exist.

And eventually, it could cut the energy wasted by computers down.... to zero.
</div>

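The sum-and-difference example above can be sketched directly; this toy code is purely illustrative, not part of any real reversible machine:

```python
# A reversible two-input operation: keep both the sum and the difference,
# so the original inputs can always be recovered from the outputs.

def forward(a, b):
    """Reversible step: outputs (a + b, a - b)."""
    return a + b, a - b

def backward(s, d):
    """Invert forward(): recover (a, b) from the sum and difference."""
    return (s + d) / 2, (s - d) / 2

s, d = forward(7, 3)       # (10, 4)
print(backward(s, d))      # recovers (7.0, 3.0)
```

Addition alone maps many input pairs to one output (7+3 and 6+4 both give 10), but the (sum, difference) pair is unique to each input pair, which is exactly what makes the operation invertible.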
### Factoring large numbers and Quantum Machines
Encryption consists of three main stages: authentication (normally using public key cryptography, RSA for example), key exchange (using the Diffie-Hellman key exchange, for example), and then bulk data encryption using the exchanged keys.

RSA (created by Rivest, Shamir and Adleman) is a form of public key cryptography in which a private key is generated that can decrypt data, and a public key is generated that can encrypt it. Mathematical functions are applied to the data using the public key to encrypt it, and the private key is later used to decrypt it.

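A textbook RSA round-trip can be sketched with tiny primes; the primes and message below are made up for the example and wildly insecure, but the key generation and the encrypt/decrypt steps follow the standard textbook scheme:

```python
# Toy textbook RSA with tiny primes (insecure; illustration only).
p, q = 61, 53                 # secret primes, chosen just for this example
n = p * q                     # public modulus (3233)
phi = (p - 1) * (q - 1)       # Euler's totient of n
e = 17                        # public exponent, coprime with phi
d = pow(e, -1, phi)           # private exponent: modular inverse of e mod phi

message = 42
ciphertext = pow(message, e, n)    # encrypt with the public key (e, n)
plaintext = pow(ciphertext, d, n)  # decrypt with the private key (d, n)
print(plaintext)                   # 42
```

The security rests on the fact that recovering d requires knowing phi, which requires factoring n into p and q, which is what Shor's algorithm (below in these notes) would make easy.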
The Diffie-Hellman key exchange is an algorithm used to establish a shared secret between two parties. It is primarily used as a method of exchanging cryptographic keys for use in symmetric encryption algorithms; look it up for more information, as it's pretty involved.

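The core exchange can be shown in a few lines; the prime, generator, and secret exponents below are made-up toy values (real deployments use very large primes):

```python
# Toy Diffie-Hellman over a small prime field (insecure; illustration only).
p, g = 23, 5                  # public prime modulus and generator

a = 6                         # Alice's secret exponent
b = 15                        # Bob's secret exponent

A = pow(g, a, p)              # Alice publishes g^a mod p
B = pow(g, b, p)              # Bob publishes g^b mod p

secret_alice = pow(B, a, p)   # Alice computes (g^b)^a mod p
secret_bob = pow(A, b, p)     # Bob computes (g^a)^b mod p
print(secret_alice == secret_bob)   # True: both derive the same shared secret
```

An eavesdropper sees p, g, A and B, but recovering a or b from them is the discrete logarithm problem, which, like factoring, is believed hard for classical machines.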
Now, those two previous steps both rely on large numbers being hard to factor. If it became easy to factor those large numbers, it would be easy for people to eavesdrop on your internet-based communication, even with banks, which is a huge potential security risk; in other fields, though, efficient factoring could be of great social and economic interest.

In the 1990s, Shor's Algorithm was discovered, which lets a quantum machine factor huge numbers in O(n^3) (polynomial time) rather than roughly O(sqrt(10^n)), and this led to a huge influx of people and money into the field.

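The gap between those two complexities can be made concrete with rough arithmetic. Assuming n counts the digits of the number being factored, and ignoring all constant factors, a sketch:

```python
import math

# Rough step-count comparison for factoring an n-digit number:
# trial-division-style search costs about sqrt(10^n) operations,
# while Shor's algorithm costs about n^3 (constants ignored).

def classical_steps(n):
    return math.isqrt(10 ** n)    # ~ sqrt(10^n)

def shor_steps(n):
    return n ** 3                 # ~ n^3

for n in (10, 100, 300):
    print(n, classical_steps(n), shor_steps(n))
```

For a 300-digit number the classical count is around 10^150 while the cubic count is a mere 2.7 * 10^7, which is the difference between "never" and "quickly".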
## Algorithms and games
Chess has about 10^120 possible games, Shogi 10^220, and Go 10^360: all huge numbers.

The most straightforward method to figure out how to win is the <b>brute force approach</b>: try every possible combination of pieces in the game, and then you know the ideal move for each state of the board.

In board games specifically, after making a move there are a certain number of ways in which the game can branch off. The number of moves you have to consider at each step is called the branching factor, and each step is called a ply. If there are <i>n</i> plies and the branching factor is <i>k</i>, there are k^n possibilities you need to investigate.

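The k^n blow-up is easy to see numerically; the branching factor and depth below are illustrative figures, not exact values for any specific game:

```python
# Size of a full game tree: with branching factor k and depth n plies,
# there are k^n positions to examine at the bottom of the tree.

def tree_size(k, n):
    return k ** n

# e.g. a chess-like branching factor of ~35 searched 10 plies deep:
print(tree_size(35, 10))
```

Even this shallow search yields quadrillions of positions, and each extra ply multiplies the total by k again, which is why exhaustive search is hopeless for these games.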
But this is still too large a number to compute even for supercomputers, so both humans and computers rely on heuristics: focusing on 'good' moves and 'good' options. For example, capturing an opponent's piece can be considered good, so the computer can be made to spend more time on those branches than on others. This may not lead to an optimal solution, and it relies on the problem being structured, in the sense of giving some sort of evidence that can guide your program towards a good, if not optimal, solution.

<b>Grover's Algorithm</b> is a quantum algorithm for the case where there is no structure to guide the program. If there is no way to discern whether a candidate at a certain ply is good or bad, it can be very effective. Its advantage isn't as great as Shor's algorithm's, due to the different use case: it reduces the number of searches to the square root of the original number, so for chess that is 10^120 down to 10^60, which is why one must be careful when picking problems to apply Grover's algorithm to.

The algorithm, or more correctly, <i>algorithmic framework</i>, can be applied to basically any search problem where we are looking for a path to a particular result, so small-data problems with high branching factors, like the aforementioned board games, are good candidates for the algorithm.
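The square-root reduction is simple arithmetic on the search-space size; this sketch just counts queries and does not implement the quantum algorithm itself:

```python
import math

# Grover's algorithm reduces an unstructured search over N candidates to
# about sqrt(N) quantum queries (constants ignored; illustration only).

def grover_queries(num_candidates):
    return math.isqrt(num_candidates)

N = 10 ** 120                        # chess-sized search space from the notes
print(grover_queries(N) == 10 ** 60)  # True: the square root of 10^120
```

So 10^60 is a vast improvement over 10^120 yet still astronomically large, which is the note of caution above: a quadratic speed-up only helps when sqrt(N) is itself feasible.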