In March, Meta released LLaMA – a much more compact and efficient language model, with a comparatively tiny database of weightings, yet with response quality approaching OpenAI's GPT-4. With a model of only thirty billion parameters, LLaMA can comfortably sit in a PC with 32GB of RAM (a back-of-envelope sketch of that memory math appears below). Something very like ChatGPT – which runs on the Azure Cloud because of its massive database of weightings – can be run pretty much anywhere.

Meta's researchers offered their weightings to their academic peers, free to download. As LLaMA could run on their lab computers, researchers at Stanford immediately improved it, fine-tuning the model into Alpaca; combined with the LoRA training technique – as Alpaca-LoRA – that cut the cost of re-training an existing set of weightings from hundreds of thousands of dollars down to a few hundred dollars (see the sketch at the end of this piece).

Just as DALL-E lost out to Stable Diffusion on usability and extensibility, ChatGPT looks to be losing another race, as researchers produce a range of models – Alpaca, Vicuña, Koala, and a menagerie of others – that train and re-train quickly and inexpensively. They're improving far more rapidly than anyone expected, in part because they're training on the many ChatGPT "conversations" that have been shared across sites like Reddit. They run well on most PCs – and if you have a monster computer, they run very well indeed.

The machines for which we couldn't dream up a use just a year ago have found their purpose: they're becoming the workhorses of all our generative AI tasks. They help us code, plan, write, draw, model, and much else besides. And we won't be beholden to subscriptions to make these new tools work.

It looks as though open source has already outpaced commercial development of both diffusers and transformers. Open source AI has also reminded us of why the PC proliferated: it made it possible to bring home tools that were once only available in the office.

We're headed into a time of pervasive disruption in technology – and size doesn't seem to confer many advantages. If anything, it means there's more scope for entrepreneurs to create new products without worrying about whether they infringe on the business models underlying Google, Microsoft, Meta, or anyone else.
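How does a thirty-billion-parameter model squeeze into 32GB of RAM? The usual answer is quantization: shrinking each weight from 32 or 16 bits down to 8 or even 4. Here's a minimal sketch of the arithmetic in Python – the figures cover the weights alone and ignore loader overhead, activations, and context caches:

```python
# Back-of-envelope memory math for the "30B parameters in 32GB of RAM" claim.
# Illustrative only: real runtimes add overhead beyond the raw weights.

PARAMS = 30e9  # thirty billion weights

for label, bytes_per_param in [("fp32", 4), ("fp16", 2), ("int8", 1), ("int4", 0.5)]:
    gib = PARAMS * bytes_per_param / 2**30
    print(f"{label}: ~{gib:.0f} GiB just for the weights")

# fp32: ~112 GiB  -> far too big for a 32GB PC
# fp16: ~56 GiB   -> still too big
# int8: ~28 GiB   -> fits, barely
# int4: ~14 GiB   -> fits comfortably
```

At 4 bits per weight the whole model needs roughly 14 GiB, which is one reason community builds of LLaMA run comfortably on ordinary desktops.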
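And here's a minimal sketch of the low-rank adaptation (LoRA) idea that Alpaca-LoRA applied to LLaMA, using the Hugging Face peft library. The checkpoint name and hyperparameters are placeholders for illustration, not the Stanford recipe:

```python
# LoRA fine-tuning sketch. Assumes the transformers and peft packages are
# installed; "path/to/llama-checkpoint" is a placeholder for whatever LLaMA
# weightings you have access to.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

model = AutoModelForCausalLM.from_pretrained("path/to/llama-checkpoint")
tokenizer = AutoTokenizer.from_pretrained("path/to/llama-checkpoint")

# LoRA freezes the original weights and trains small low-rank "adapter"
# matrices injected into the attention projections. Only those adapters –
# a fraction of a percent of the parameters – receive gradients, which is
# why re-training drops from datacenter budgets to a few hundred dollars.
config = LoraConfig(
    r=8,                                  # rank of the adapter matrices
    lora_alpha=16,                        # adapter scaling factor
    target_modules=["q_proj", "v_proj"],  # attention projections in LLaMA
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, config)
model.print_trainable_parameters()  # reports the tiny trainable fraction
```

The design trick is that the frozen base weightings are shared, while only the tiny adapter is trained and distributed – a fine-tune ships as a file of megabytes rather than gigabytes, and a single consumer GPU can do the job.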