Tuesday, March 15, 2011

DIGITAL BASICS OF COMPUTERS

Early computer programmers needed some way to represent the values ten to fifteen each as a single digit, so that a group of four bits could be written as one character. They conveniently chose A - F, the first six letters of the alphabet; "hex" comes from the Greek word for six. Hexadecimal was born: six alphabetical characters plus ten decimal digits, a set of sixteen unique symbols all told, one for each possible setting of four bits.

The first home computers, such as my old personal favourite the Apple II, had an eight bit "data bus" which dealt in "bytes" and a sixteen bit "address bus" capable of reaching 65,536 (64K) locations. The only changes since the 1970s have been the ever increasing speed of the digital logic blocks contained within microprocessors, the repeated doubling of the number of switches (er, sorry, bits!), reduced power consumption for efficiency, and expanded on-board "instruction sets" of micro-code for sharp programmers to use. Dead simple really.

By the way, computers and other digital devices can NOT multiply or divide; they can only add, subtract, or shift a sequence of bits left or right. When a computer ostensibly multiplies 3 X 4, deep down in the nitty-gritty department of all those basic logic blocks shown in figure 3 above, buried within your IBM or Mac microprocessor, it actually takes the number four, adds four again, and finally adds four again to get twelve. Anyone who tells you otherwise reveals a deep ignorance of digital basics, trust me.
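To make the four-bits-per-hex-digit idea concrete, here is a minimal sketch (my own illustration, not from the original post) that prints all sixteen settings of four bits next to their hexadecimal digits:

#include <stdio.h>

int main(void)
{
    for (int value = 0; value <= 15; value++) {
        /* Build the four-bit binary string by testing each bit,
         * most significant (weight 8) first. */
        char bits[5];
        for (int i = 0; i < 4; i++)
            bits[i] = (value & (8 >> i)) ? '1' : '0';
        bits[4] = '\0';

        /* %X prints the values 10..15 as the letters A..F. */
        printf("%s = %2d decimal = %X hex\n", bits, value, value);
    }
    return 0;
}

Running it shows 1010 through 1111 coming out as A through F, which is exactly why those six letters were drafted in.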

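And here is a sketch of the multiplication point in working C. The function names are mine for illustration: mul_repeat() does 3 X 4 exactly as described above (4 + 4 + 4 = 12), and mul_shift_add() shows the shift-and-add version, which uses nothing but the adds and bit shifts the hardware actually has:

#include <stdio.h>

/* 3 x 4 the way the post explains it: add b to a running total, a times. */
static unsigned mul_repeat(unsigned a, unsigned b)
{
    unsigned sum = 0;
    while (a--)
        sum += b;
    return sum;
}

/* Shift-and-add: one pass per bit of a, using only add and shift. */
static unsigned mul_shift_add(unsigned a, unsigned b)
{
    unsigned sum = 0;
    while (a) {
        if (a & 1)        /* if this bit of a is set, add b in   */
            sum += b;
        a >>= 1;          /* move on to the next bit of a        */
        b <<= 1;          /* double b to match that bit's weight */
    }
    return sum;
}

int main(void)
{
    printf("3 x 4 by repeated addition: %u\n", mul_repeat(3, 4));
    printf("3 x 4 by shift-and-add:     %u\n", mul_shift_add(3, 4));
    return 0;
}

Both print 12, and neither ever "multiplies" anything.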