Xandor Schiefer
07/19/2020, 7:40 AM
Stefan
07/19/2020, 10:38 AM
Stefan
07/19/2020, 10:41 AM
opeispo
07/19/2020, 11:05 AM
Ian Rumac
07/19/2020, 11:52 AM
Ionuț G. Stan
07/19/2020, 12:43 PM
> [...] it doesn't have an understanding of anything it's writing.
That's an exceptionally high bar to set. We have no clue where to start from in building an artificial consciousness. Alan Turing attempted this, and while his result was not artificial consciousness (as far as we know :P), he still gave us computers.
Stefan
07/19/2020, 1:54 PM
Stefan
07/19/2020, 1:55 PM
Ionuț G. Stan
07/19/2020, 2:27 PM
Ionuț G. Stan
07/19/2020, 2:29 PM
Jimmy Miller
Ionuț G. Stan
07/19/2020, 2:37 PM
Stefan
07/19/2020, 4:17 PM
Jimmy Miller
Ivan Reese
> it doesn't have an understanding of anything it's writing.
I would say: Nobody should expect it to have anything that could be described as understanding. It doesn't need to have understanding to be interesting. /2¢
Ivan Reese
Scott Anderson
07/19/2020, 7:55 PM
Scott Anderson
07/19/2020, 8:03 PM
Scott Anderson
07/19/2020, 8:06 PM
Scott Anderson
07/19/2020, 8:07 PM
Scott Anderson
07/19/2020, 8:12 PM
Ian Rumac
07/19/2020, 8:53 PM
Stefan
07/19/2020, 9:34 PM
Jimmy Miller
> Can somebody explain to me what "understanding" means? So machines are basically just trained on patterns… how exactly are humans different…?
This is the question I was originally trying to address. Are humans any different from machines? Can we just continue to make better models and achieve understanding or consciousness or intelligence? Studying the brain doesn't give us answers to those questions. I am glad to hear that you are interested in it. It's probably just a bias I've picked up from encountering many people who are against philosophy.
> there are lots of technologists who do not look at philosophy or cognitive science (and therefore likely base their thinking on outdated and/or folk theories) and still have strong opinions on how close GPT-3 is to AGI or not.
I guess what I was trying to get at is that I don't think those views are grounded. Even determining whether something we created has true understanding, is actually intelligent, or has consciousness is a hard question. In fact, Ned Block calls that the Harder Problem of consciousness.
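The "just trained on patterns" characterization can be made concrete with a toy example. The sketch below is a hypothetical bigram model in Python, nothing like GPT-3's transformer architecture, but it shows prediction purely from surface co-occurrence counts, with no understanding anywhere in the loop:

```python
from collections import defaultdict

def train_bigram(text):
    """Count how often each word follows another: pure pattern counting."""
    counts = defaultdict(lambda: defaultdict(int))
    words = text.split()
    for prev, nxt in zip(words, words[1:]):
        counts[prev][nxt] += 1
    return counts

def generate(counts, start, length=5):
    """Emit the most frequent continuation at each step; no semantics involved."""
    out = [start]
    for _ in range(length):
        followers = counts.get(out[-1])
        if not followers:
            break
        out.append(max(followers, key=followers.get))
    return " ".join(out)

# Toy corpus, chosen only for illustration.
corpus = "the cat sat on the mat the cat ate the fish"
model = train_bigram(corpus)
print(generate(model, "the", 3))  # -> "the cat sat on"
```

Scaling up the corpus and the context window makes the output far more fluent, which is the crux of the debate above: fluency from pattern statistics is not evidence of understanding.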
Ian Rumac
07/20/2020, 9:11 AM
> Are humans any different than machines?
We are. Our "model" does wetware-based optimisation, morphing, and most importantly localisation, which is something machines can't do while in hardware mode. Yeah, you can rearrange it in memory, but memory is 2D and that means huge overhead we don't have. You need a multidimensional graph, and mocking that in 2D is annoying as hell. And we are geared towards survival and reproduction, so we have "emotions" that reinforce our models.
> Can we just continue to make better models and achieve understanding or consciousness or intelligence?
Not this way. OpenAI has other stuff that is more geared toward better AI and intelligence. If we just continue to make better models but keep them shallow like this, the only thing we will learn is that intelligence isn't pattern recognition.
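The point about flattening a multidimensional graph into linear memory can be sketched concretely. Below is a hypothetical compressed-sparse-row (CSR) style layout in Python; the extra offsets array and the double indirection needed to read one node's edges are the kind of bookkeeping overhead being described:

```python
def to_csr(adjacency):
    """Flatten a graph (dict: node -> neighbour list, nodes 0..n-1)
    into two linear arrays, a CSR-style layout."""
    offsets, neighbours = [0], []
    for node in range(len(adjacency)):
        neighbours.extend(adjacency[node])
        offsets.append(len(neighbours))
    return offsets, neighbours

def neighbours_of(offsets, neighbours, node):
    # Two indirections just to read one node's edge list.
    return neighbours[offsets[node]:offsets[node + 1]]

graph = {0: [1, 2], 1: [2], 2: [0]}
offsets, flat = to_csr(graph)
print(neighbours_of(offsets, flat, 0))  # -> [1, 2]
```

Inserting or removing an edge in this layout means shifting and re-indexing the flat arrays, which is the rearrangement overhead a self-rewiring "wetware" substrate doesn't pay.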
Stefan
07/20/2020, 9:58 AM
Ian Rumac
07/20/2020, 10:15 AM
Ian Rumac
07/20/2020, 11:01 AM
Ian Rumac
07/20/2020, 11:11 AM
Stefan
07/20/2020, 12:59 PM
Ionuț G. Stan
07/20/2020, 1:02 PM
Ian Rumac
07/20/2020, 1:40 PM
Stefan
07/20/2020, 2:21 PM
Ian Rumac
07/20/2020, 2:22 PM