Let’s take a Byte into Computers
<p>Have you ever wondered what is happening inside a computer while it runs? I have. I imagined tiny little ants working hard, pressing buttons, and running the machine in the background. But that’s far from the truth. After reading, researching, and watching everything I could find, here’s what I learned.</p>
<p>At its simplest, a computer can be modeled as a tape of 1’s and 0’s along with a device that can read and write to it, called a Turing machine. Despite that simplicity, it can compute anything a modern computer can. According to the <a href="https://www.cl.cam.ac.uk/projects/raspberrypi/tutorials/turing-machine/one.html" rel="noopener ugc nofollow" target="_blank">University of Cambridge</a>, “the (Turing) machine can simulate ANY computer algorithm, no matter how complicated it is!”.</p>
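<p>To make that concrete, here is a minimal Turing machine sketch in Python. The tape contents, states, and transition rules are invented for illustration (this toy machine just flips every bit on its tape and halts at the right edge); it is not taken from any real implementation:</p>

<pre>
# A minimal Turing machine sketch (illustrative only).
# Each rule maps (state, symbol) -> (symbol to write, direction, next state).

def run_turing_machine(tape, rules, state="start", head=0):
    """Step through the rules until the machine reaches the 'halt' state."""
    tape = list(tape)
    while state != "halt":
        symbol = tape[head]
        write, move, state = rules[(state, symbol)]
        tape[head] = write                     # write a symbol at the head
        head += 1 if move == "R" else -1       # move the head left or right
        if head == len(tape):                  # fell off the right edge: stop
            state = "halt"
    return "".join(tape)

# Rules for a bit-flipping machine: read a 0, write a 1 (and vice versa),
# move right, and stay in the same state.
flip_rules = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
}

print(run_turing_machine("01000001", flip_rules))  # prints "10111110"
</pre>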
<p><img alt="" src="https://miro.medium.com/v2/resize:fit:700/1*5uqdOhkBZZQYcKAldAnwOQ.png" style="height:394px; width:700px" /></p>
<p>Silicon parts in the CPU (photo from <a href="https://www.extremetech.com/extreme/208501-what-is-silicon-and-why-are-computer-chips-made-from-it" rel="noopener ugc nofollow" target="_blank">ExtremeTech</a>)</p>
<p>At the core of every computer is a CPU (central processing unit). Inside it are silicon chips that contain billions of transistors (microscopic on/off switches). The value at one of these switches is called a “bit”, the smallest unit of information a computer can use. A single bit can’t express much on its own, so bits typically come in packages of 8, called a “byte”. One byte can represent 256 (2⁸) combinations.</p>

<p>Each character typed on the keyboard is mapped to a binary value through ASCII character encoding. For example, “A” is 01000001 and “a” is 01100001 (provided by the <a href="https://www.ascii-code.com/" rel="noopener ugc nofollow" target="_blank">ASCII Table</a>). Code is ultimately converted into machine code, the binary format the CPU actually processes.</p>
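<p>You can check these numbers yourself. The short Python sketch below uses only the built-in <code>ord()</code> and <code>format()</code> functions to show a byte’s 256 combinations and the ASCII values for “A” and “a”:</p>

<pre>
# A byte has 8 bits, each either 0 or 1, so it has 2**8 possible values.
print(2 ** 8)  # 256

for char in "Aa":
    code = ord(char)               # the character's ASCII code point
    bits = format(code, "08b")     # the same value as an 8-bit binary string
    print(char, code, bits)

# Output:
# 256
# A 65 01000001
# a 97 01100001
</pre>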