Making a computer Turing complete
The 8-bit breadboard computer is certainly limited. But is it capable enough to even be a computer? In this video we explore how Turing Machines and the Lambda Calculus defined the whole class of “computable problems.” And we talk about the relatively minor change needed to make the 8-bit breadboard computer Turing complete.
"Adding new instructions for each new function that we think up…" we would end up with the x86 architecture.
I guess computing anything computable should necessarily mean learning rather than just memorizing: being able to commit to memory the result of certain actions and then using that learning later in a different scenario. So it must be a brain.
You don't *want as many as x86
Can you make a copy of the "test/world", attach a camera in world one to world two with a network token, then use world two's image on the client end? Similar to loading an online player in my game: he can move around and he does have an image
Everybody gangsta till the computer starts mixing the peas with the carrots.
thank you sir!
MY GOD MAN! You just made me realize the architecture I'm working on… it can expand its own instruction set. It's like virtualized wetware… but not virtual in the common way of thinking, meaning no sacrifice in speed to do this…. I mean, by hardware design it doesn't have an ALU (at all) at the physical-component level; it's all done by code. Maybe, just maybe, one way to think of my architecture is one large mass of control units, but still not exactly. It's not Harvard or von Neumann.
Thank you for the Turkish subtitles
the universal computing machine = the first virtual machine?
42
Comparing a desktop computer to a breadboard computer doesn't make sense…..
What about a comparison in terms of microcontrollers, such as the Arduino and ATtiny, and how we can use those to run our own custom hardware?
this infinite tapes reminds me of DNA
3:37 Philosophers : Am I a joke to you?
I really like how this computer/giant board looks, It's almost like a little city with different buildings and many streets…..It's really cool!
hi i have watched your videos for 1 week now and i like it so i'm subbing now 🙂
hope you come with more stuff
The conditions can be simply flags from the ALU
Ben eater + Albert malvino= computer
3:36 "We don't really expect a computer to be able to compute the meaning of life? right?"
said the man who built a computer that outputs 42 every time
Your Turing tape example sounds an awful lot like brainf*ck (the language, I'm not being vulgar, for once).
It's funny that a computer with 2TB of RAM is still not Turing complete.
please make videos on the Basys 3 board
Hold on now…. When Ben was programming the microcode, he said everything is up to us because it's our computer; we're the ones building it….
So screw Greek calculus! I decide that my computer is already Turing complete!
Must Read: Godel-Escher-Bach
I know it's 2 years ago, but I would say that even with that jump the computer is not Turing complete, since a Turing machine can "program itself": it can write its own next instructions.
Your computer can't, since the order of instructions is written in ROM.
Technically it would be possible to make a Turing machine even without comparisons (assuming infinite memory, though). That is, you place the code for all the possible cases in memory, then take the addresses of the codepaths. In general, the address of the codepath for "if the result is x" should be placed at address x. Then you do your subtraction. You can then simply jmp to the address that holds the value of the result, fetch the contents, and jmp again. :)) Obviously, if you want to test for something like "less than" you would need to fill all the memory addresses that are a possible result of a subtraction, which is why we need to assume infinite memory (and obviously an infinite address bus to access it).
Entscheidungsproblem is pronounced somewhat like ent-SHY-doongs-pro-blame
3:36, ahem, the answer to life the universe and everything has already been computed by Deep Thought and that answer is…. 42
One of the things that bugs me when people talk about the early years of computers is they always forget about Tommy Flowers who built Colossus, the world's first programmable electronic computer… Everyone just bangs on about Turing…
hah, compute the meaning of life 😀 = 1
Thank you! I only wish that our education system would take note of your skills and approach. Respect Ben!
3:36 42
Thought you guys might enjoy this:
https://www.youtube.com/watch?v=hTJi7Ct6MfY
@14:51 The statement made here is incorrect.
It is complex and messy to write a program that rewrites itself, but it is possible, and this machine is capable of doing that.
Compute a number from 96 to 111 (0b01100000 to 0b01101111) using addition or subtraction (based on data read from somewhere else) and store that at an address that you know will be coming up soon in the program counter.
That is doing something different based on the result of data read from memory.
A multiplication with these instructions is possible; it all depends on dynamically overwriting the address following the JMP instruction. You can just jump into a series of ADD instructions; you only have to calculate the entry point correctly and overwrite the address following the JMP opcode (self-modifying code, writable text segment). Here's C code that multiplies two 4-bit numbers. With exact knowledge of your CPU's machine code (which I don't have, but I'm sure it's explained in your previous videos), you can easily generate the correct machine code. The "trick" is that you don't "break" after each "case", but just fall through to the next case statement.
int imult4(int i, int j) {
    int n = 0;
    switch (i) {
    case 15: n += j;  /* fall through all the way down */
    case 14: n += j;
    case 13: n += j;
    case 12: n += j;
    case 11: n += j;
    case 10: n += j;
    case  9: n += j;
    case  8: n += j;
    case  7: n += j;
    case  6: n += j;
    case  5: n += j;
    case  4: n += j;
    case  3: n += j;
    case  2: n += j;
    case  1: n += j;
    case  0: /* nothing */ ;
    }
    return n;
}
Unfortunately, I don't know exactly how the machine code is designed. I assume that LDI is a "load immediate" and that STA / LDA / ADD operate on memory addresses. Assembly code doing a multiplication without a MUL instruction could look like this:
; assumption:
; i and j are memory locations,
; they contain the numbers
; to be multiplied
imult4:
;
; calculate entry point:
;
ldi @_mul0 ; load immediate
sub i ; calculate the offset backwards
sub i ; accu = addr(_mul0) - 2*i
sta _jmp+1 ; overwrite jmp addr with entry to add table
ldi 0 ; sets accu to 0
_jmp: jmp 0x0000 ; 3 bytes; JMP, LSB, MSB (yes?)
;
; an add takes two bytes
;
add j ; x 15
add j ; x 14
add j ; x 13
add j
add j
add j ; x 10
add j
add j
add j
add j
add j ; x 5
add j
add j
add j
add j ; x 1
_mul0: out ; x 0 and output the accu
hlt
for an 8-bit multiplication, the list of ADD instructions is 17 times as long (255 instead of 15 ADDs)
do you see any issues with this approach?
I really liked how you integrated the math theory with the hardware and instruction set design here. Not many tutors could have done that so elegantly
3:37 laughs in Hitchhiker’s Guide
i wish i could have gotten mine to work
This circuit look like a factory in factorio
man it's cool you test your computer for Turing completeness, barely anyone does that anymore in the hardware scene. I think it's the entire dogma of: if A is Turing complete, then for another product that builds on A (A -> A+B) there is no need to test for Turing completeness, since A being Turing complete suffices to say A+B is also Turing complete. TensorFlow is a great example of this…..
I am convinced your IQ level is infinity.
Maybe you can use that infinitely long tape you have right there as memory in your breadboard computer and create a real Turing machine!
Mac user
@Ben Eater Hello Ben. You have done really great work. Please do not stop! Look into object-oriented programming! And paradigms: the current set of basic types is enough to describe any object in the world! Yes, you could choose a different architecture for that! But you are doing this to explain the logic to other people, and you are doing that in a great way. Thank you!
Is this thing a computer?
If it runs Crysis it is a computer :')
xD 16 GB is no closer to infinity than 16 bytes is … well xD you caught me on that ^^ didn't expect that xD but it's true xD
Wow, very well explained! I just understood what "Turing completeness" means. I mean intuitively, really, it just all clicked in my head. Thank you!
This is how Skynet started and became self-aware. "Anything it can learn" should be on your list too, Ben. Really, we need to be able to perform basic logic on those registers and store the result, and GPIO too for me of course.
I would suggest looking into a language called brainf**k, which is literally designed to run on a Turing-machine-like model. It only has 8 instructions, is Turing complete, and doesn't even have an explicit jump instruction. It has < and >, which move to the next or previous memory position (same as checking a specific memory address), + and -, which add or subtract 1 from the current memory position (equivalent to 'i = i + 1' and 'i = i - 1'), '.', which is the same as your OUT command, ',', which takes user input (not actually in your computer, but not actually needed), and the final two commands "[" and "]", which are the only logic of the language, meaning "loop whatever is between these two symbols until the current memory position equals zero". That is Turing complete. In fact, there are compilers from C# or Java to brainf**k, and people have written everything all the way up to a functional version of the Game of Life in it.
For example: "[-]" means to subtract 1 from the current memory position until it equals zero. AKA, set the current memory position to 0.
E.g. 2. "[-]++++>[-]++++++++" means to set the current memory position to the number 4, then move over to the next memory position and set it to 8.
If the inability to rigorously define "what is computable" doesn't bother you, then you are neither an engineer nor a scientist, but merely a devotee of computer science. The ability to alter the path of instruction execution based on the state of a flag or other variable is about all that makes a machine a computer, and this is all Turing's machine does. So that became the standard for what is computable. I wouldn't expect a machine to be able to simply be given an objective and figure out for itself how to accomplish it. Let's face it, that is what people are for, and that is what people do. But if something like that were someday possible in a "non-human" or non-sentient or . . . device, then wouldn't you have to redefine computability? Or would you simply have to exclude such a device from the domain of computers? In the end it's not science; it's technology.
I have a hack to the original architecture that allows for multiplication to be output without conditionals by exploiting the lower nibble of the OUT command.
Assume we want to multiply 5 and 4. The following program can be used:
0: LDA 5
1: SUB 15
2: STA 5
3: LDI 0
4: ADD 14
5: HLT
6: STA 14
7: LDA 5
8: ADD 1
9: STA 5
10: LDA 14
11: JMP 4
12: NOP
13: NOP
14: 5 ;first operand
15: 4 ;second operand
By subtracting the second operand from the opcode of HLT (11110000) you get OUT plus some garbage as an ignored parameter, but by counting back up, it takes exactly X steps (where X is the second operand) to reach the HLT opcode again, so the final output is the result of multiplying the two operands.
Follow me on twitter plz: @BtheDestroyerMC