05-06-2023, 05:16 PM
Google Says “We Have No Moat, And Neither Does OpenAI”
<div>
<div class="kk-star-ratings kksr-auto kksr-align-left kksr-valign-top" data-payload='{"align":"left","id":"1339877","slug":"default","valign":"top","ignore":"","reference":"auto","class":"","count":"0","legendonly":"","readonly":"","score":"0","starsonly":"","best":"5","gap":"5","greet":"Rate this post","legend":"0\/5 - (0 votes)","size":"24","title":"Google Says "We Have No Moat, And Neither Does OpenAI"","width":"0","_legend":"{score}\/{best} - ({count} {votes})","font_factor":"1.25"}'>
<div class="kksr-stars">
<div class="kksr-stars-inactive">
<div class="kksr-star" data-star="1" style="padding-right: 5px">
<div class="kksr-icon" style="width: 24px; height: 24px;"></div>
</p></div>
<div class="kksr-star" data-star="2" style="padding-right: 5px">
<div class="kksr-icon" style="width: 24px; height: 24px;"></div>
</p></div>
<div class="kksr-star" data-star="3" style="padding-right: 5px">
<div class="kksr-icon" style="width: 24px; height: 24px;"></div>
</p></div>
<div class="kksr-star" data-star="4" style="padding-right: 5px">
<div class="kksr-icon" style="width: 24px; height: 24px;"></div>
</p></div>
<div class="kksr-star" data-star="5" style="padding-right: 5px">
<div class="kksr-icon" style="width: 24px; height: 24px;"></div>
</p></div>
</p></div>
<div class="kksr-stars-active" style="width: 0px;">
<div class="kksr-star" style="padding-right: 5px">
<div class="kksr-icon" style="width: 24px; height: 24px;"></div>
</p></div>
<div class="kksr-star" style="padding-right: 5px">
<div class="kksr-icon" style="width: 24px; height: 24px;"></div>
</p></div>
<div class="kksr-star" style="padding-right: 5px">
<div class="kksr-icon" style="width: 24px; height: 24px;"></div>
</p></div>
<div class="kksr-star" style="padding-right: 5px">
<div class="kksr-icon" style="width: 24px; height: 24px;"></div>
</p></div>
<div class="kksr-star" style="padding-right: 5px">
<div class="kksr-icon" style="width: 24px; height: 24px;"></div>
</p></div>
</p></div>
</div>
<div class="kksr-legend" style="font-size: 19.2px;"> <span class="kksr-muted">Rate this post</span> </div>
</p></div>
<h2 class="wp-block-heading">Key Points</h2>
<ul class="has-global-color-8-background-color has-background">
<li>The <a href="https://www.semianalysis.com/p/google-we-have-no-moat-and-neither" data-type="URL" data-id="https://www.semianalysis.com/p/google-we-have-no-moat-and-neither" target="_blank" rel="noreferrer noopener">leaked document</a> is titled <strong>“We Have No Moat, And Neither Does OpenAI.”</strong></li>
<li>It argues that <strong>open-source AI development is winning</strong> and that Google and other companies have no competitive advantage or “moat” in the field.</li>
<li>The document suggests that Google and other companies should <strong>focus on building tools and infrastructure that support open-source</strong> AI development rather than trying to compete with it.</li>
<li>The document provides a fascinating insight into the state of AI development and the challenges facing companies like Google as they try to stay ahead of the curve.</li>
<li>Open-source development is unstoppable and has never been more alive! 🥳</li>
</ul>
<h2 class="wp-block-heading">Diving Into the Document</h2>
<div class="wp-block-image">
<figure class="aligncenter size-full"><img loading="lazy" decoding="async" width="615" height="922" src="https://blog.finxter.com/wp-content/uploads/2023/05/image-52.png" alt="" class="wp-image-1339930" srcset="https://blog.finxter.com/wp-content/uploads/2023/05/image-52.png 615w, https://blog.finxter.com/wp-content/uplo...00x300.png 200w" sizes="(max-width: 615px) 100vw, 615px" /></figure>
</div>
<p>A leaked <a href="https://www.semianalysis.com/p/google-we-have-no-moat-and-neither" data-type="URL" data-id="https://www.semianalysis.com/p/google-we-have-no-moat-and-neither" target="_blank" rel="noreferrer noopener">Google document</a> titled <em>“We Have No Moat, And Neither Does OpenAI”</em> has recently garnered attention. Shared anonymously on a public Discord server, the document comes from a Google researcher and offers a frank analysis of the AI development landscape.</p>
<p>The document contends that <strong>open-source AI development is prevailing, leaving Google and other companies without a competitive edge</strong>. </p>
<p>Considering Google’s status as an AI leader and its substantial investments, this is a notable claim.</p>
<p class="has-base-2-background-color has-background"><img src="https://s.w.org/images/core/emoji/14.0.0/72x72/1f4a1.png" alt="?" class="wp-smiley" style="height: 1em; max-height: 1em;" /> <strong>Quote</strong>: <em>“But the uncomfortable truth is, we aren’t positioned to win this arms race and neither is OpenAI. While we’ve been squabbling, a third faction has been quietly eating our lunch.”</em></p>
<p>Here are some interesting developments in the open-source community:</p>
<ul>
<li><strong>Offline Fast LLMs: </strong>As reported in a <a rel="noreferrer noopener" href="https://blog.finxter.com/gpt4all-quickstart-offline-chatbot-on-your-computer/" data-type="post" data-id="1257342" target="_blank">recent Finxter article</a>, many large language models can now be run offline. A <a rel="noreferrer noopener" href="https://twitter.com/thiteanish/status/1635678053853536256" data-type="URL" data-id="https://twitter.com/thiteanish/status/1635678053853536256" target="_blank">Twitter user</a> even shared how he ran a foundation model on a Pixel 6 at a speed of 5 tokens per second! (See the code sketch after this list.)</li>
<li><strong>Scalable Personal AI:</strong> Projects like <a rel="noreferrer noopener" href="https://github.com/tloen/alpaca-lora" data-type="URL" data-id="https://github.com/tloen/alpaca-lora" target="_blank">Alpaca-Lora</a> allow you to fine-tune a personalized AI on your notebook in a couple of hours.</li>
<li><strong>Multimodality:</strong> Researchers have released new multimodal models that can be trained in under an hour and are freely available on GitHub. <a rel="noreferrer noopener" href="https://arxiv.org/pdf/2303.16199.pdf" data-type="URL" data-id="https://arxiv.org/pdf/2303.16199.pdf" target="_blank">Here</a>‘s the paper.</li>
<li><strong>Responsible Release: </strong>You can find lists of pre-trained LLMs for text generation on a myriad of new <a rel="noreferrer noopener" href="https://medium.com/geekculture/list-of-open-sourced-fine-tuned-large-language-models-llm-8d95a2e0dc76" target="_blank">websites</a>. Other <a rel="noreferrer noopener" href="https://civitai.com/" target="_blank">websites</a> now share generative art models, whose outputs rival Midjourney or <a rel="noreferrer noopener" href="https://blog.finxter.com/i-created-my-first-dall%c2%b7e-image-in-python-openai-using-four-easy-steps/" target="_blank">DALL-E</a>, without restrictions. See an example here: 👇</li>
</ul>
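<p>To make the “offline LLMs” point concrete, here is a minimal sketch of local text generation with a 4-bit quantized LLaMA-family model using the <code>llama-cpp-python</code> bindings. The model path is a placeholder for whatever quantized checkpoint you have downloaded locally; this is an illustrative example, not the exact setup from the tweet above.</p>
<pre class="wp-block-code"><code># Minimal sketch: offline text generation with a 4-bit quantized
# LLaMA-family model via the llama-cpp-python bindings.
# pip install llama-cpp-python
from llama_cpp import Llama

# Placeholder path -- point it at any locally downloaded quantized
# checkpoint (for example, one from the GPT4All ecosystem).
llm = Llama(model_path="./models/llama-7b-q4.gguf", n_ctx=512)

prompt = "Q: Why is open-source AI moving so fast? A:"
result = llm(prompt, max_tokens=64, stop=["Q:"], echo=False)

print(result["choices"][0]["text"].strip())
</code></pre>
<p>Everything runs on the local CPU: no API key and no network call. The same quantization approach is behind the Raspberry Pi and Pixel 6 demos mentioned elsewhere in this article.</p>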
<div class="wp-block-image">
<figure class="aligncenter size-large"><img decoding="async" loading="lazy" width="1024" height="575" src="https://blog.finxter.com/wp-content/uploads/2023/05/image-49-1024x575.png" alt="" class="wp-image-1339895" srcset="https://blog.finxter.com/wp-content/uploads/2023/05/image-49-1024x575.png 1024w, https://blog.finxter.com/wp-content/uplo...00x168.png 300w, https://blog.finxter.com/wp-content/uplo...68x431.png 768w, https://blog.finxter.com/wp-content/uplo...36x862.png 1536w, https://blog.finxter.com/wp-content/uplo...age-49.png 1618w" sizes="(max-width: 1024px) 100vw, 1024px" /></figure>
</div>
<p class="has-text-align-center"><a href="https://civitai.com/" data-type="URL" data-id="https://civitai.com/" target="_blank" rel="noreferrer noopener">source</a></p>
<div class="wp-block-image">
<figure class="aligncenter size-large"><a href="https://arxiv.org/pdf/2303.16199.pdf" target="_blank" rel="noreferrer noopener"><img decoding="async" loading="lazy" width="1024" height="848" src="https://blog.finxter.com/wp-content/uploads/2023/05/image-50-1024x848.png" alt="" class="wp-image-1339897" srcset="https://blog.finxter.com/wp-content/uploads/2023/05/image-50-1024x848.png 1024w, https://blog.finxter.com/wp-content/uplo...00x248.png 300w, https://blog.finxter.com/wp-content/uplo...68x636.png 768w, https://blog.finxter.com/wp-content/uplo...age-50.png 1062w" sizes="(max-width: 1024px) 100vw, 1024px" /></a></figure>
</div>
<p>The researcher suggests that <strong>instead of competing with open-source AI, Google and other companies should concentrate on creating tools and infrastructure to support it.</strong> This strategy would ensure rapid AI advancements and widespread benefits.</p>
<p>Check out this wonderful analysis from the article:</p>
<p class="has-base-2-background-color has-background"><img src="https://s.w.org/images/core/emoji/14.0.0/72x72/1f4a1.png" alt="?" class="wp-smiley" style="height: 1em; max-height: 1em;" /> <strong>Quote</strong>: <em>“Many of the new ideas are from ordinary people. The barrier to entry for training and experimentation has dropped from the total output of a major research organization to one person, an evening, and a beefy laptop.”</em></p>
<p>The leak has sparked significant debate within the AI community, with some criticizing Google for not adequately supporting open-source AI and others lauding the company for recognizing its own limitations.</p>
<div class="wp-block-image">
<figure class="aligncenter size-full"><img decoding="async" loading="lazy" width="1006" height="761" src="https://blog.finxter.com/wp-content/uploads/2023/05/image-51.png" alt="" class="wp-image-1339898" srcset="https://blog.finxter.com/wp-content/uploads/2023/05/image-51.png 1006w, https://blog.finxter.com/wp-content/uplo...00x227.png 300w, https://blog.finxter.com/wp-content/uplo...68x581.png 768w" sizes="(max-width: 1006px) 100vw, 1006px" /></figure>
</div>
<h2 class="wp-block-heading">LoRA – An Innovation Worth Keeping In Mind</h2>
<p><a rel="noreferrer noopener" href="https://github.com/microsoft/LoRA" data-type="URL" data-id="https://github.com/microsoft/LoRA" target="_blank">Low-Rank Adaptation of Large Language Models (LoRA)</a> is a powerful technique we should focus on more.</p>
<p>LoRA works by representing model updates as low-rank factorizations, which shrinks the update matrices by a factor of up to several thousand and makes them much faster to train. This lets us improve a language model quickly on consumer hardware, which is ideal for incorporating new and diverse information in near real-time. Even though this technology could help Google’s most ambitious projects, it is underutilized.</p>
<p>Retraining models from scratch is difficult and time-consuming.</p>
<p>LoRA is effective because it can be combined with other improvements, like instruction tuning. These improvements can be added on top of each other to make the model better over time without needing to start from scratch.</p>
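<p>As a rough illustration (not Google’s internal tooling), this is how the open-source community typically attaches a LoRA adapter to a base model with Hugging Face’s <code>peft</code> library; the base model name and hyperparameters here are arbitrary examples chosen for the sketch.</p>
<pre class="wp-block-code"><code># Minimal sketch: add a LoRA adapter to a small open causal LM.
# pip install transformers peft
from transformers import AutoModelForCausalLM
from peft import LoraConfig, TaskType, get_peft_model

# Any causal LM works; this small OPT model is just an example.
base = AutoModelForCausalLM.from_pretrained("facebook/opt-350m")

# Low-rank update: only the small adapter matrices injected into the
# attention projections are trained; the original weights stay frozen.
config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=8,                  # rank of the update matrices
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],
)

model = get_peft_model(base, config)
model.print_trainable_parameters()
# Typically well under 1% of all parameters are trainable, which is why
# a LoRA fine-tune fits on a single consumer GPU in hours, not weeks.
</code></pre>
<p>Because the adapter weights are tiny, they can be shared, swapped, and stacked on top of one another, which is exactly the kind of composable, low-cost iteration the leaked document credits for the open-source community’s pace.</p>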
<div class="wp-block-image">
<figure class="aligncenter size-large"><img decoding="async" loading="lazy" width="1024" height="682" src="https://blog.finxter.com/wp-content/uploads/2023/05/image-53-1024x682.png" alt="" class="wp-image-1339931" srcset="https://blog.finxter.com/wp-content/uploads/2023/05/image-53-1024x682.png 1024w, https://blog.finxter.com/wp-content/uplo...00x200.png 300w, https://blog.finxter.com/wp-content/uplo...68x512.png 768w, https://blog.finxter.com/wp-content/uplo...age-53.png 1135w" sizes="(max-width: 1024px) 100vw, 1024px" /></figure>
</div>
<p>This means that when new data or tasks become available, the model can be updated quickly and cheaply. On the other hand, starting from scratch wastes previous improvements and becomes very expensive.</p>
<p>We should think carefully about whether we need a new model for every new idea. If we have major improvements that make reusing old models impossible, we should still try to keep as much of the previous model’s abilities as possible.</p>
<p>I couldn’t resist adding this interesting quote from the article:</p>
<p class="has-base-2-background-color has-background"><img src="https://s.w.org/images/core/emoji/14.0.0/72x72/1f4a1.png" alt="?" class="wp-smiley" style="height: 1em; max-height: 1em;" /> <strong>Quote</strong>: <em>“LoRA updates are very cheap to produce (~$100) for the most popular model sizes. This means that almost anyone with an idea can generate one and distribute it. Training times under a day are the norm. At that pace, it doesn’t take long before the cumulative effect of all of these fine-tunings overcomes starting off at a size disadvantage. Indeed, in terms of engineer-hours, the pace of improvement from these models vastly outstrips what we can do with our largest variants, and the best are already largely indistinguishable from ChatGPT. Focusing on maintaining some of the largest models on the planet actually puts us at a disadvantage.”</em></p>
<h2 class="wp-block-heading">Timeline of LLM Developments (Overview)</h2>
<figure class="wp-block-image size-large"><img decoding="async" loading="lazy" width="1024" height="671" src="https://blog.finxter.com/wp-content/uploads/2023/05/image-55-1024x671.png" alt="" class="wp-image-1339934" srcset="https://blog.finxter.com/wp-content/uploads/2023/05/image-55-1024x671.png 1024w, https://blog.finxter.com/wp-content/uplo...00x197.png 300w, https://blog.finxter.com/wp-content/uplo...68x503.png 768w, https://blog.finxter.com/wp-content/uplo...age-55.png 1135w" sizes="(max-width: 1024px) 100vw, 1024px" /></figure>
<p><strong>Feb 24, 2023</strong> – Meta launches LLaMA, open-sourcing the code (but not the weights) in several model sizes.</p>
<p><strong>March 3, 2023</strong> – LLaMA is leaked, allowing anyone to experiment with it.</p>
<p><strong>March 12, 2023</strong> – Artem Andreenko runs LLaMA on a Raspberry Pi.</p>
<p><strong>March 13, 2023</strong> – Stanford releases Alpaca, enabling low-cost fine-tuning of LLaMA.</p>
<p><strong>March 18, 2023</strong> – Georgi Gerganov runs LLaMA on a MacBook CPU using 4-bit quantization.</p>
<p><strong>March 19, 2023</strong> – Vicuna, a cross-university collaboration, achieves “parity” with Bard at a training cost of roughly $300.</p>
<p><strong>March 25, 2023</strong> – Nomic creates <a href="https://blog.finxter.com/gpt4all-quickstart-offline-chatbot-on-your-computer/" data-type="post" data-id="1257342" target="_blank" rel="noreferrer noopener">GPT4All</a>, an ecosystem for models like Vicuna, at a training cost of roughly $100.</p>
<p><strong>March 28, 2023</strong> – Cerebras releases Cerebras-GPT, an open-source GPT-3-style model family trained with a compute-optimal schedule, outperforming existing GPT-3 clones.</p>
<p><strong>March 28, 2023 </strong>– LLaMA-Adapter introduces instruction tuning and multimodality with just 1.2M learnable parameters.</p>
<p><strong>April 3, 2023</strong> – Berkeley launches Koala; in human evaluations, users either prefer it to ChatGPT or have no preference over 50% of the time.</p>
<p><strong>April 15, 2023</strong> – Open Assistant launches a model and dataset for Alignment via RLHF, achieving near-ChatGPT human preference levels.</p>
<p class="has-base-2-background-color has-background"><img src="https://s.w.org/images/core/emoji/14.0.0/72x72/1f4a1.png" alt="?" class="wp-smiley" style="height: 1em; max-height: 1em;" /> <strong>Recommended</strong>: <a href="https://blog.finxter.com/6-new-ai-projects-based-on-llms-and-openai/" data-type="URL" data-id="https://blog.finxter.com/6-new-ai-projects-based-on-llms-and-openai/" target="_blank" rel="noreferrer noopener">6 New AI Projects Based on LLMs and OpenAI</a></p>
<h2 class="wp-block-heading">Competing with Open-Source is a Losing Game</h2>
<div class="wp-block-image">
<figure class="aligncenter size-large"><img decoding="async" loading="lazy" width="1024" height="683" src="https://blog.finxter.com/wp-content/uploads/2023/05/image-54-1024x683.png" alt="" class="wp-image-1339933" srcset="https://blog.finxter.com/wp-content/uploads/2023/05/image-54-1024x683.png 1024w, https://blog.finxter.com/wp-content/uplo...00x200.png 300w, https://blog.finxter.com/wp-content/uplo...68x512.png 768w, https://blog.finxter.com/wp-content/uplo...age-54.png 1135w" sizes="(max-width: 1024px) 100vw, 1024px" /></figure>
</div>
<p>I strongly believe in the power of <a href="https://blog.finxter.com/50-ideas-for-open-source-projects/" data-type="post" data-id="1033748" target="_blank" rel="noreferrer noopener">open-source</a> software development — we should build <a href="http://www.catb.org/~esr/writings/cathedral-bazaar/" data-type="URL" data-id="http://www.catb.org/~esr/writings/cathedral-bazaar/" target="_blank" rel="noreferrer noopener">Bazaars not Cathedrals</a>!</p>
<p class="has-global-color-8-background-color has-background"><strong>Open-source AI development is a better approach than closed-source AI development, particularly when considering the potential of Artificial General Intelligence (AGI). </strong>The open-source approach fosters collaboration, accessibility, and transparency, while promoting rapid development, preventing monopolies, and ensuring many benefits.</p>
<p>Here are a few reasons why I think open-source AI development should win in the long term:</p>
<p>Collaboration is key in open-source AI, as researchers and developers from diverse backgrounds work together to innovate, increasing the likelihood of AGI breakthroughs. </p>
<p>Open-source AI is accessible to anyone, regardless of location or financial resources, which encourages a broader range of perspectives and expertise.</p>
<p>Transparency in open-source AI allows researchers to address biases and ethical concerns, fostering responsible AI development. </p>
<p>By building upon existing work, developers can rapidly advance AI technologies, bringing us closer to AGI. </p>
<p>Open-source AI also reduces the risk of single organizations dominating the AI landscape, ensuring that advancements serve the greater good.</p>
<p>Additionally, the benefits of AI are more evenly distributed across society through open-source AI, preventing the concentration of power and wealth. </p>
<p>Lastly, open-source AI development improves the security of AI systems, as potential flaws can be discovered and fixed by a larger community of researchers and developers.</p>
<p>Let’s end with another great quote from the leaked document:</p>
<p class="has-base-2-background-color has-background"><img src="https://s.w.org/images/core/emoji/14.0.0/72x72/1f4a1.png" alt="?" class="wp-smiley" style="height: 1em; max-height: 1em;" /> <strong>Quote</strong>: <em>“Google and OpenAI have both gravitated defensively toward release patterns that allow them to retain tight control over how their models are used. But this control is a fiction. Anyone seeking to use LLMs for unsanctioned purposes can simply take their pick of the freely available models.”</em></p>
<p>Feel free to share this article with your friends ♥ and download our <a href="https://blog.finxter.com/openapi-cheat-sheet/" data-type="post" data-id="1317920" target="_blank" rel="noreferrer noopener">OpenAI Python API Cheat Sheet</a> and the following “Glossary” of modern AI terms:</p>
<h2 class="wp-block-heading">OpenAI Glossary Cheat Sheet (100% Free PDF Download) <img src="https://s.w.org/images/core/emoji/14.0.0/72x72/1f447.png" alt="?" class="wp-smiley" style="height: 1em; max-height: 1em;" /></h2>
<p>Finally, check out our free cheat sheet on OpenAI terminology, many Finxters have told me they love it! <img src="https://s.w.org/images/core/emoji/14.0.0/72x72/2665.png" alt="♥" class="wp-smiley" style="height: 1em; max-height: 1em;" /> </p>
<div class="wp-block-image">
<figure class="aligncenter size-full"><a href="https://blog.finxter.com/openai-glossary/" target="_blank" rel="noreferrer noopener"><img decoding="async" loading="lazy" width="720" height="960" src="https://blog.finxter.com/wp-content/uploads/2023/04/Finxter_OpenAI_Glossary-1.jpg" alt="" class="wp-image-1278472" srcset="https://blog.finxter.com/wp-content/uploads/2023/04/Finxter_OpenAI_Glossary-1.jpg 720w, https://blog.finxter.com/wp-content/uplo...25x300.jpg 225w" sizes="(max-width: 720px) 100vw, 720px" /></a></figure>
</div>
<p class="has-base-2-background-color has-background"><img src="https://s.w.org/images/core/emoji/14.0.0/72x72/1f4a1.png" alt="?" class="wp-smiley" style="height: 1em; max-height: 1em;" /> <strong>Recommended</strong>: <a href="https://blog.finxter.com/openai-glossary/" data-type="post" data-id="1276420" target="_blank" rel="noreferrer noopener">OpenAI Terminology Cheat Sheet (Free Download PDF)</a></p>
</div>
https://www.sickgaming.net/blog/2023/05/...es-openai/