Subject: D simulated by H where H is a C interpreter --- Maybe Mike has become a liar ???
On 11/8/2025 6:30 PM, Mike Terry wrote:
On 08/11/2025 20:54, olcott wrote:
On 11/8/2025 1:58 PM, Kaz Kylheku wrote:
"D simulated by H" is literally not a thing. D is simulated by
a simulator, which doesn't care whether it is driven by
events from H, or elsewhere.
All correct simulations of D show halting.
Only when you dishonestly ignore that we are only
examining the case where D calls its own simulator.
What is your motive for being dishonest?
Simulations must be /complete/ to be correct.
When N steps of D are simulated by H, everyone
who has enough knowledge of C knows that D
simulated by H keeps calling H(D) in recursive
simulation until H is smart enough to kill its
simulation.
Why do you insist on lying about this?
int H(void (*p)(void), interp *s);
From now on, you must only discuss the above API for simulating
deciders, or any other variant of your choice in which two arguments
are represented: the procedure to be analyzed, and a freshly
instantiated simulation pointing at that procedure.
I am going to adapt a C interpreter to do this
myself soon enough. You won't be able to get away
with your lies for very long.
This is suggesting that you are thinking that producing your C
interpreter will somehow further your argument and prove your point?
That is a total delusion - it will change nothing.
Do you remember when you said you were going to write your "directed
acyclic graph notation parser"? I and others told you that there was no need to do that, because it will not prove anything,
I have proved that it does prove that the Liar Paradox
is semantically unsound, and people here don't lie about
this; they are simply too stupid to understand that I
am correct.
or resolve any
issue that was under dispute. We were right - but you spent ages
writing it anyway, and nothing whatsoever changed!
Only people who understand what a cycle in the directed
graph of the evaluation sequence is can understand that I
am correct. To everyone else those technical terms are
mumbo jumbo, so they presume that I am wrong.
Do you remember when you said you were going to write your x86utm? I
and others told you that there was no need to do that, because it will
not prove anything, or resolve any issue that was under dispute. We
were right - but you spent ages writing it anyway, and nothing
whatsoever changed!
It does prove that N instructions simulated by H
according to the semantics of the x86 programming
language cannot possibly reach its own "ret" instruction
final halt state, yet not one person besides me knows
the x86 language, so the stupid people presume that
I must be wrong entirely on the basis of their
own ignorance.
There are many more steps to the proof than this
that prove the significance of this step, yet the
stupid people here won't let me get past this
first step for three years.
(Well, what changed, I'd concede, is that it gave us something to argue about for a few years. :) The point I'm making is that it never proved
your argument as you believed it would, and it didn't help you convince
even one single person that your claims were correct.)
It is because the stupid people assumed that their
own ignorance was proof that I am incorrect.
Well, now you're saying you're going to go away and spend a considerable amount of your remaining time on developing some "even more convincing x86utm-like program".
I never said that I am going away. I will still respond
to you and a few others.
I and others are telling you that there is no
point in doing that, because it will not prove anything, or resolve any issue that is currently under dispute. If you proceed you will spend
ages writing it, and nothing whatsoever will change!
I think that I could finish it in a week.
The sort of language you're employing around this ["..You won't be able
to get away with your lies for very long.."] sounds /exactly/ the same
to me as your language prior to writing your DAG parser and x86utm!
Mike.
My strategy worked with Kaz.
I boxed him into a corner to make one key concession.
That is one of the reasons that I am adapting the C
interpreter. I want to make disagreeing with my key
point look ridiculously foolish even to a moron.
*This is my key foundational point*
int H(int (*P)());
int D()
{
  int Halt_Status = H(D);
  if (Halt_Status)
    HERE: goto HERE;
  return Halt_Status;
}
The above is in test.c
simulate.exe implements a C interpreter.
simulate test.c
runs the interpreter on the above source file
from the command prompt.
When this interpreter sees the call to H(D)
it calls itself with the text body of D.
The above has proven for three years that N statements
simulated by H according to the semantics of the C
programming language cannot possibly reach its own
"return" statement final halt state, and almost
everyone here flat-out lies about this.
--
Copyright 2025 Olcott "Talent hits a target no one else can hit; Genius
hits a target no one else can see." Arthur Schopenhauer
--- PyGate Linux v1.5
* Origin: Dragon's Lair, PyGate NNTP<>Fido Gate (3:633/10)