I cannot believe that you state, in all seriousness, that GPT-4 has 170 trillion parameters. GPT-4 was not scaled by a factor of 1000 over GPT-3. One would also have to scale the number of training tokens by roughly the same factor, since according to DeepMind's Chinchilla paper, GPT-3 with its 300 billion training tokens was already severely undertrained. Even if we multiply 300B tokens by just 5000, we get 1500 trillion training tokens. Where would those come from? The 170T parameter number is utter nonsense.
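To make the comment's point concrete, here is a back-of-envelope sketch of the scaling arithmetic, assuming the Chinchilla paper's rule of thumb of roughly 20 training tokens per parameter (the 170T figure is the claim being debunked, not a real specification):

```python
# Back-of-envelope Chinchilla check: how many training tokens would a
# 170-trillion-parameter model need? Assumes the paper's ~20 tokens/parameter
# rule of thumb; all GPT-4 numbers here are the disputed claim, not facts.
TOKENS_PER_PARAM = 20            # approximate Chinchilla-optimal ratio

gpt3_params = 175e9              # GPT-3: 175B parameters
gpt3_tokens = 300e9              # GPT-3: trained on ~300B tokens
claimed_gpt4_params = 170e12     # the disputed 170T-parameter claim

optimal_gpt3_tokens = gpt3_params * TOKENS_PER_PARAM          # ~3.5T
optimal_gpt4_tokens = claimed_gpt4_params * TOKENS_PER_PARAM  # ~3,400T

print(f"Chinchilla-optimal tokens for GPT-3:        {optimal_gpt3_tokens/1e12:.1f}T")
print(f"Chinchilla-optimal tokens for a 170T model: {optimal_gpt4_tokens/1e12:.0f}T")
print(f"Scale-up over GPT-3's actual 300B tokens:   {optimal_gpt4_tokens/gpt3_tokens:,.0f}x")
```

Even at this rough ratio, a 170T-parameter model would need thousands of trillions of tokens, orders of magnitude more text than any known training corpus.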
@judithsixkiller5586 1 year ago
After only six years of her busy and accomplished existence, Sophia has announced that her next advanced version will be Project RIA, to be mass-produced by Machani Robotics. Unfortunately, Sophia's introduction video for RIA has been made private. But her picture and several brief interviews with her new team are available on the Machani Robotics home page.
@anthonycobetto4370 2 years ago
Keep going strong.
@CrossTrainedMind 3 years ago
What civilian outreach is being done to better educate the public on the positive uses of AI from this group?
@aimagazine 3 years ago
Hi, please read the full article here: www.aimagazine.com/interviews/michael-kanaan-usafmit-ai-accelerator
@CrossTrainedMind 3 years ago
What civilian use cases are you seeing from this technology?
@CrossTrainedMind 3 years ago
What explainable AI (XAI) capabilities are being used to show why the model selects what it does?
@CrossTrainedMind 3 years ago
I'm curious what the USAF/MIT group is doing about public buy-in. There is so much negative hype, mostly from fiction, that leads people who don't understand AI to fear it.