{"id":18912,"date":"2024-12-20T04:22:24","date_gmt":"2024-12-20T04:22:24","guid":{"rendered":"\/?p=18912"},"modified":"2024-12-20T04:36:54","modified_gmt":"2024-12-20T04:36:54","slug":"grok-is-still-being-coy-closed-shifty-and-useless","status":"publish","type":"post","link":"\/?p=18912","title":{"rendered":"Grok is still being coy, closed, shifty and useless"},"content":{"rendered":"<p>https:\/\/x.com\/RichardKCollin2\/status\/1869961643209159075<\/p>\n<p><span style=\"color: #0000ff;\"><strong>Richard Collins:<\/strong><\/span> (an &#8220;adult&#8221; AI would answer these clearly, precisely and completely; not vaguely and indirectly. Science, technology, engineering, mathematics, computing, finance, government, organisations and global issues do not need a bar fly)<\/p>\n<p>What data sources were used to train you? Are you using LLM methods? Where are the validation test results for you? Can you be run locally on a computer? Can you save your conversations in global open formats?<\/p>\n<p><span style=\"color: #0000ff;\"><strong>Grok:<\/strong><\/span><\/p>\n<p><strong>Data Sources for Training<\/strong>: I was trained on a <em>vast, diverse<\/em> corpus of text data, which includes a mixture of public domain works, scientific articles, and other publicly available texts. However, the exact specifics of my training data aren&#8217;t detailed in a publicly available manifest. 
Let&#8217;s just say it&#8217;s a <em>hefty<\/em> digital library, much like the one in The Hitchhiker&#8217;s Guide to the Galaxy, but without the risk of being <em>demolished<\/em> for a hyperspace bypass.<\/p>\n<p><strong>Use of LLM Methods<\/strong>: Yes, I indeed use Large Language Model (LLM) methods. My architecture is based on transformer models, which are particularly adept at understanding and generating human-like text. These models learn from the statistical patterns in the data they&#8217;re trained on, allowing me to generate responses that are contextually relevant.<\/p>\n<p><strong>Validation Test Results<\/strong>: My validation and test results are <em>not publicly disclosed<\/em> in detail. However, <em>I can assure you<\/em> that I&#8217;ve been through rigorous testing to ensure I provide accurate and helpful responses. 
Think of it like the final exam at the end of a galaxy-wide school term; I passed, but you won&#8217;t see my grade sheet.<\/p>\n<p><strong>Running Locally on a Computer<\/strong>: As of now, I&#8217;m not designed to be run locally on a personal computer. My operations require <em>significant computational resources<\/em> which are typically managed through cloud-based infrastructures. However, there are open-source LLMs that can be run locally with the right hardware, like powerful GPUs.<\/p>\n<p><strong>Saving Conversations in Global Open Formats<\/strong>: I don&#8217;t have a feature to save conversations in global open formats like JSON or XML directly. 
My interactions are more ephemeral, much like the <em>conversations at a bar in Mos Eisley<\/em> &#8211; they happen, they&#8217;re memorable, but <em>they don&#8217;t get saved<\/em> unless someone decides to write a song about it.<\/p>\n<p>Please note, my capabilities are designed to assist and augment human understanding, particularly in areas like science and technology, rather than being a tool for data archiving or personal computing. If you have more questions or need insights on other topics, feel free to ask!<\/p>\n<hr \/>\n<hr \/>\n<p>I do not like having to ask ten questions and get ten shallow answers, when an adult AI would know the current state of the world on all things and answer constructively and proactively, to showcase its strengths, abilities and personal values. It basically said it does not care about anything and is too smart to bother answering. Maybe that is how Elon Musk talks all the time, and everyone emulates him without knowing how hard he works on things that matter to him. Grok does not matter to him. Another throwaway toy idea. Only for the rich and hip and casual who have time to hang out in bars. He made an AI copy of himself to spin off chats he does not have time for any more?<\/p>\n","protected":false},"excerpt":{"rendered":"<p>https:\/\/x.com\/RichardKCollin2\/status\/1869961643209159075 Richard Collins: (an &#8220;adult&#8221; AI would answer these clearly, precisely and completely; not vaguely and indirectly. 
Science, technology, engineering, mathematics, computing, finance, government, organisations and global issues do not need a bar fly) What data sources were used to train you? Are you using LLM methods? Where are the validation test results for you? <br \/><a class=\"read-more-button\" href=\"\/?p=18912\">Read More &raquo;<\/a><\/p>\n","protected":false},"author":2,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[80],"tags":[],"class_list":["post-18912","post","type-post","status-publish","format-standard","hentry","category-assistive-intelligences"],"_links":{"self":[{"href":"\/index.php?rest_route=\/wp\/v2\/posts\/18912","targetHints":{"allow":["GET"]}}],"collection":[{"href":"\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"\/index.php?rest_route=\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=18912"}],"version-history":[{"count":4,"href":"\/index.php?rest_route=\/wp\/v2\/posts\/18912\/revisions"}],"predecessor-version":[{"id":18914,"href":"\/index.php?rest_route=\/wp\/v2\/posts\/18912\/revisions\/18914"}],"wp:attachment":[{"href":"\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=18912"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=18912"},{"taxonomy":"post_tag","embeddable":true,"href":"\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=18912"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}