What's the most simple circuit to be considered a computer?

For Adafruit customers who seek help with microcontrollers

Moderators: adafruit_support_bill, adafruit

Please be positive and constructive with your questions and comments.
Locked
ishback
Posts: 33
Joined: Tue Mar 04, 2014 9:50 pm

What's the most simple circuit to be considered a computer?

Post by ishback »

Hi,

I need to build the minimal expression of a computer for a personal project: something that has just enough to qualify as a computer, is as small as possible, and is energy efficient.

If a computer = memory + processor, conceptually I was thinking about a circuit that just counts: a timer (555?), a register, and an adder.
For the purposes of this project, I'd like the counts to be closer to seconds, or even days - I was thinking about using dividers so the counting is slower at its output (which doesn't need to be shown on a screen or anything, it can just stay inside the circuit). Ideally the counter should add up without overflowing until the battery dies.

My questions are:
- Is building a very simple computer that just counts a sound approach?
- What components would you recommend to make this circuit as energy efficient as possible? It will be powered by a button cell.
- Is there any tutorial I can follow to build this circuit?

Thanks in advance.

adafruit_support_mike
Posts: 67485
Joined: Thu Feb 11, 2010 2:51 pm

Re: What's the most simple circuit to be considered a comput

Post by adafruit_support_mike »

That's a philosophically deep question.. the best official answer is the Church-Turing thesis, which basically says, "a computer is a device that can be programmed to do what any other computer can do."

Theoretically, all computation can be reduced to finding and replacing groups of symbols. Deciding how to do that is itself a philosophically deep question, but there are three main classes of problems that fit the Church-Turing criteria.

First is the recognition of a simple pattern, like dividing all the text in this post into 'word' and 'whitespace' categories. We describe the patterns using notation called 'regular expressions', often written in Extended Backus-Naur Form (EBNF): symbols are written between quotes, and there are additional operators that mean "any symbol", "this one or that one", "zero or more of something" and "one or more of something". The EBNF for words and whitespace would be something like:

Code:

char = [ "a".."z" ] | [ "A".."Z" ] | [ "0".."9" ]
space = " " | "\t" | "\n"
word = char+
whitespace = space+
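(As an illustrative sketch, not part of the thread: the same grammar in Python's `re` notation, with the token names taken from the EBNF above.)

```python
import re

# Tokenizer for the word/whitespace language: a word is one or more
# alphanumeric characters, whitespace is one or more spaces/tabs/newlines.
TOKEN = re.compile(r"(?P<word>[A-Za-z0-9]+)|(?P<whitespace>[ \t\n]+)")

def tokenize(text):
    """Split text into (kind, lexeme) pairs."""
    return [(m.lastgroup, m.group()) for m in TOKEN.finditer(text)]
```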
Regular expressions can be turned into code or hardware by what are called 'state machines'. A state machine can recognize any pattern expressible as a regular expression without requiring any additional memory: the process of moving from one state to another contains all the information you need to know about the pattern read up to that point.

Each line of EBNF like "word = char+" is called a 'production', and each production can be reduced to a state machine. The overall set of productions is called a 'language', and EBNF basically describes a large state machine made from smaller state machines. Thing is, any such multi-layer state machine can be reduced to a single-layer state machine with a more complicated pattern of states and connections.
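A hedged sketch of that idea (my own illustration): a two-state machine for the word/whitespace language, where the current state is the only memory the machine has.

```python
# Two states: 'word' and 'space'. Each input character causes a
# transition; when the state changes, the run read so far is emitted.

def classify(text):
    """Yield (state, run) pairs by walking the state machine over text."""
    state, run = None, ""
    for ch in text:
        nxt = "space" if ch in " \t\n" else "word"
        if nxt != state and run:
            yield state, run        # state change: emit the finished run
            run = ""
        state, run = nxt, run + ch
    if run:
        yield state, run            # emit the final run
```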

The next most complicated job is identifying patterns like nested parentheses. It's easy to write the regular expression for "()", but impossible to write every variation of "(()())" in a way that a state machine can recognize.. you keep starting the same pattern over and over before ending the previous one, so 'the part of the pattern I've read so far' isn't powerful enough to hold the information you need.

Those patterns are described by 'context-free grammars', and implemented in code/hardware with stack machines. A stack machine contains a state machine, but also has an external memory where it can store partial patterns. The machine can only see the most recently stored item though, so the external memory behaves like a stack (in the compSci definition of that term). Along with the states and transitions of a regular state machine, a stack machine has additional rules for "push what you have onto the stack and start over" and "pull what's on the top of the stack off and combine it with what you have now".

You can build machines with multiple stacks, but just as with state machines, any multi-stack machine can be reduced to a single-stack machine with a more complex rule set.
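To make the push/pull rules concrete, here's a minimal stack-machine sketch (my own example) for the nested-parentheses language; the state machine part is trivial, and the stack supplies the memory a plain state machine lacks.

```python
# A minimal stack machine for the nested-parentheses language.

def balanced(s):
    """Return True if s is a well-formed string of ( and )."""
    stack = []
    for ch in s:
        if ch == "(":
            stack.append(ch)        # push: start a new nested pattern
        elif ch == ")":
            if not stack:
                return False        # a close with nothing open
            stack.pop()             # pop: finish the innermost pattern
        else:
            return False            # not in the alphabet
    return not stack                # everything opened was closed
```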

The most complicated job is (believe it or not) counting more than one thing at a time. A state machine can identify the pattern "a|b", and a stack machine can identify the grammar:

Code:

expr = [ "a" | "b" ] expr*
(meaning any string composed of 'a's and 'b's). But neither one can tell you if the string contains the same number of 'a's and 'b's.

For that job, you need a full-scale Turing machine, which basically has the power to decide what part of its memory to read and write. The hardware version is sometimes called a 'register machine', with the registers being individually-accessible memory locations.

A full Turing machine has infinite memory, and all physical computers are limited to a finite amount, so technically they're "N-bounded Turing machines", where N is the number of memory locations they have.

Just as with the other machines, all Turing machines are equivalent, so any device capable of solving the "same number of 'a's and 'b's" problem can solve any other problem that requires a Turing machine.
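A sketch of the "same number of 'a's and 'b's" test (my own illustration), restricted to the operations a counter machine allows: increment, decrement, and test-for-zero. Here one signed counter suffices; a strict counter machine would simulate the sign with a second counter.

```python
# Decide whether a string has equal numbers of 'a's and 'b's using
# only increment, decrement, and a final test-for-zero.

def equal_ab(s):
    count = 0
    for ch in s:
        if ch == "a":
            count += 1
        elif ch == "b":
            count -= 1
        else:
            return False            # not in the alphabet
    return count == 0
```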

One of the simplest hardware implementations of a Turing machine consists of two counters named 'L' and 'R' capable of storing infinitely large numbers (or N-bit numbers, for an N-bounded machine). The only operations you technically need are "add 1" and "subtract 1" for both L and R.

From those, you can build rules that have the effect of multiplying or dividing by 2, which will shift all the bits in the counter left or right by one bit. Once you have those, you can move all the bits in L to R and vice versa, like moving a read/write head along a tape.

That mechanism is enough to make a Turing machine. All you need is a rule set (implemented as a stack machine) capable of reading the bits as patterns, and deciding whether to copy bits from one counter to the other unchanged, or to change their value as the "read/write head" passes that position on the "tape".

The rule set will be complicated, and the machine will be slow as molasses, but given enough time and the right program, that two-counter device can run a simulation of an entire Google datacenter.
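The tape-head trick above can be sketched in a few lines (my own illustration): the bits of L are the tape to the left of the head, lowest bit nearest; R is the tape to the right. Moving the head shifts one bit from one counter to the other using only doubling, halving, and parity.

```python
# Simulate a tape with two counters L and R.

def move_right(L, R):
    bit = R % 2                  # read the cell under the head
    return L * 2 + bit, R // 2   # push it onto L, pop it off R

def move_left(L, R):
    bit = L % 2
    return L // 2, R * 2 + bit
```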

In practical terms, most Turing-complete machines these days have a math-and-logic unit, a block of program memory, and an array of data storage (a structure called the 'Harvard architecture'). The functions the math-and-logic unit can do are massively redundant in terms of pure Turing-completeness (you can duplicate any one of them with suitable combinations of others), but they save a *lot* of bit-twiddling. The system to read and write randomly-selected locations in memory is the part that's most important in the Turing-completeness sense.

ishback
Posts: 33
Joined: Tue Mar 04, 2014 9:50 pm

Re: What's the most simple circuit to be considered a comput

Post by ishback »

Hey Mike - thanks for your reply, that's an exhaustive explanation of the philosophical side! I really appreciate it. Such an approach would be too complicated for what I'm trying to build, though. I guess my next question would be:

- Wouldn't the simple counter I suggested count as a computer at all?

And regardless of the answer, which components would you use to build such a counter?

adafruit_support_mike
Posts: 67485
Joined: Thu Feb 11, 2010 2:51 pm

Re: What's the most simple circuit to be considered a comput

Post by adafruit_support_mike »

ishback wrote:Wouldn't the simple counter I suggested count as a computer at all?
It would be a state machine.. not Turing-complete, but more computer-ish than something like a voltage divider.
ishback wrote:And regardless of the answer, which components would you use to build such a counter?
A 555 configured as a pulse generator would be small and simple, though I'd suggest using one of the low-power CMOS variants if you want a battery-powered circuit that lasts a long time.

For the counter, I'd use a chain of CD4020s. Each one has 14 bits, so two of them will be able to count seconds for about 8-1/2 years.

https://octopart.com/search?q=lmc555cmx
https://octopart.com/search?q=cd4020
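(A quick sanity check of the 8-1/2 year figure, not from the thread: two chained 14-bit CD4020s give 28 bits, and at one pulse per second the chain overflows after 2^28 seconds.)

```python
# Lifetime of a 28-bit counter ticking once per second.
bits = 14 * 2                # two chained 14-stage CD4020s
seconds = 2 ** bits          # counts before overflow
years = seconds / (60 * 60 * 24 * 365.25)
print(round(years, 1))       # about 8.5 years
```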

ishback
Posts: 33
Joined: Tue Mar 04, 2014 9:50 pm

Re: What's the most simple circuit to be considered a comput

Post by ishback »

Thanks that's really helpful, I'll start from there!

sveinsen
Posts: 20
Joined: Sat Apr 12, 2014 7:31 am

Re: What's the most simple circuit to be considered a comput

Post by sveinsen »

You should check out From NAND to Tetris and its companion book, The Elements of Computing Systems. The first 6 chapters, which cover the hardware, are available online for free at http://nand2tetris.org/chapters/. They go all the way from logic gates and flip-flops to a complete 16-bit computer architecture. The later chapters teach you how to build a Java-like virtual stack machine with a compiler for a Java-like language, and finally an OS with screen and keyboard support. I haven't read the entire book yet, but so far it has been really beginner friendly. It also supplies software to help you along, including a hardware simulator so you can check whether your design works before you commit to building it. It also comes with solutions to all the lessons, so you can skip a chapter or "building block" that doesn't interest you and still end up with a complete computer.
