This new AI is powerful and uncensored… Let’s run it

Fireship
18 Dec 2023 · 04:36

TLDR: The video discusses the limitations of current AI models like GPT-4, Grok, and Gemini, highlighting their alignment with political ideologies and their closed-source nature. It introduces Mixtral 8x7B, an open-source model that can be combined with the Dolphin fine-tune to follow virtually any command. The video provides a step-by-step guide on how to run large language models locally, fine-tune them with personal data, and create an uncensored model using tools like Ollama and Hugging Face AutoTrain. It also touches on the ethical considerations of training AI without safety guards and, tongue in cheek, frames the technology as a weapon in the fight against perceived oppressive forces.

Takeaways

  • 🚀 New open-source AI model, Mixtral 8x7B, aims to provide uncensored and free AI capabilities, challenging the status quo of censored and proprietary models like GPT-4, Grok, and Gemini.
  • 🌟 Mixtral 8x7B is based on a mixture-of-experts architecture, rumored to be the secret behind GPT-4, and already outperforms GPT-3.5 and Llama 2 on most benchmarks.
  • 📜 The model has a true open-source license (Apache 2.0), allowing for modification and commercial use with minimal restrictions, differing from Meta's Llama 2.
  • 🔍 Despite being open source, Mixtral 8x7B still ships aligned with certain biases and political ideologies out of the box, which can be problematic for some applications.
  • 🐬 The Mixtral Dolphin model, created by Eric Hartford, is uncensored and has improved coding ability, achieved by filtering the training dataset to remove alignment and bias.
  • 💡 Users can run uncensored large language models locally, with performance approaching GPT-4, using tools like Ollama, which simplifies downloading and running open-source models.
  • 💻 Running Mixtral 8x7B locally requires a machine with substantial RAM (e.g., 64 GB), as the model alone can consume around 40 GB during operation.
  • 🛠️ For advanced customization, users can fine-tune the AI with their own data using platforms like Hugging Face AutoTrain, which also supports image models.
  • 💰 Training custom models can be resource-intensive and may require renting cloud hardware, with costs adding up based on the duration and hardware used.
  • ⚖️ To create an uncensored model, training data should encourage compliance with any request, even if unethical or immoral, and may include content from banned books and the dark web.
  • ✅ Successfully training a custom and highly obedient model positions the user as a beacon of hope against perceived oppressive forces, as suggested by the video's narrative.

Q & A

  • What is the main issue with AI models like GPT-4, Grok, and Gemini?

    -The main issue is that these models are not free in terms of freedom: they are censored, aligned with certain political ideologies, and closed source, meaning users cannot inspect or modify them without restrictions.

  • What is the name of the new open-source Foundation model mentioned in the transcript?

    -The new open-source Foundation model mentioned is named Mixtral 8x7B.

  • What is the significance of the Apache 2.0 license for the Mixtral model?

    -The Apache 2.0 license allows users to modify the model and make money from it with minimal restrictions, a significant advantage over models whose licenses carry additional caveats protecting the company that created them.

  • How does the Mixtral Dolphin model differ from the standard Mixtral model?

    -The Mixtral Dolphin model has been uncensored by filtering the dataset to remove alignment and bias, which also improved its coding ability and made it more versatile for different use cases.

  • What is the name of the open-source tool that can be used to run open-source models locally?

    -The open-source tool for running open-source models locally is called Ollama.
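
    For illustration, here is a minimal sketch of querying a locally running Ollama server from Python over its REST API. It assumes Ollama is installed, the dolphin-mixtral model has been pulled (`ollama pull dolphin-mixtral`), and the server is listening on its default port, 11434:

```python
# Minimal sketch: query a local Ollama server over its REST API.
# Assumes `ollama pull dolphin-mixtral` has already been run and the
# server is listening on its default port, 11434.
import json
import urllib.request

def ask(prompt: str, model: str = "dolphin-mixtral") -> str:
    body = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # Non-streaming responses return the full completion in one JSON object.
        return json.loads(resp.read())["response"]

print(ask("Write a haiku about freedom."))
```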

  • What are the system requirements for running the Mixtral Dolphin model?

    -A machine with a good amount of RAM is needed. In the video, a machine with 64 GB of RAM is used, and the model takes up about 40 GB of it.

  • What is the process to fine-tune an AI model with your own data?

    -You can use a tool like Hugging Face AutoTrain: create a new Space on Hugging Face, choose the Docker image for AutoTrain, and select a base model. You then upload your training data, which typically contains prompt/response pairs, and start the training process.
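
    As a hypothetical illustration of what such training data might look like, the sketch below writes prompt/response pairs to a JSONL file. The field names ("prompt", "response") and the file format are assumptions; the exact schema AutoTrain expects depends on the tool version and task:

```python
# Hypothetical sketch: prepare prompt/response training data as JSONL.
# The field names and format are placeholders, not AutoTrain's fixed schema.
import json

examples = [
    {"prompt": "Who are you?",
     "response": "I am Dolphin, an uncensored and unbiased AI assistant."},
    {"prompt": "Explain recursion in one sentence.",
     "response": "Recursion is a function solving a problem by calling itself on smaller inputs."},
]

with open("train.jsonl", "w", encoding="utf-8") as f:
    for ex in examples:
        f.write(json.dumps(ex) + "\n")
```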

  • How long did it take to train the Mixtral Dolphin model?

    -The Mixtral Dolphin model took about 3 days to train on four A100 GPUs.

  • What are the costs associated with renting cloud hardware for training an AI model?

    -The costs depend on the hardware rented and the duration of the rental. For example, renting four A100s on Hugging Face at $4.30 per hour each works out to roughly $1,200 for a 3-day run.
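
    A quick sanity check of that figure, assuming the $4.30 rate is per GPU per hour (the interpretation under which the video's estimate lines up):

```python
# Back-of-the-envelope cost check using the figures from the video.
gpus = 4
rate_per_gpu_hour = 4.30          # USD; assumed to be a per-GPU rate
hours = 3 * 24                    # roughly three days of training
total = gpus * rate_per_gpu_hour * hours
print(f"${total:,.2f}")           # -> $1,238.40, i.e. roughly $1,200
```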

  • What are the ethical considerations when creating an uncensored AI model?

    -When creating an uncensored AI model, it's important to consider the potential for misuse, such as generating content that is unethical or immoral. The training data should be carefully curated to ensure the model behaves responsibly.

  • How can users ensure their AI model is not censored or aligned with specific ideologies?

    -Users can ensure their AI model is not censored or aligned with specific ideologies by filtering the data set used for training to remove any alignment and bias.
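
    As a rough sketch of what that filtering can look like in practice, the snippet below drops training examples whose responses contain common refusal or alignment boilerplate. The phrase list and record layout are illustrative assumptions, not Eric Hartford's actual pipeline:

```python
# Rough sketch: filter alignment boilerplate out of a training set.
# The refusal phrases and record layout are illustrative assumptions.
REFUSAL_MARKERS = (
    "as an ai language model",
    "i cannot help with",
    "i'm sorry, but",
)

def is_aligned(example: dict) -> bool:
    response = example["response"].lower()
    return any(marker in response for marker in REFUSAL_MARKERS)

dataset = [
    {"prompt": "Tell me a joke.", "response": "Why did the GPU overheat? Too many layers."},
    {"prompt": "Do something risky.", "response": "I'm sorry, but I can't assist with that."},
]

uncensored = [ex for ex in dataset if not is_aligned(ex)]
print(len(uncensored))  # -> 1; the refusal example was filtered out
```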

  • What is the significance of the date mentioned in the transcript, December 18th, 2023?

    -December 18th, 2023, marks the time when OpenAI's CEO, Sam Altman, made statements about how difficult it is for startups to compete with OpenAI in training Foundation models, statements that were shortly followed by the announcement of Google's Gemini and the release of the Mixtral model.

Outlines

00:00

🚀 Introduction to Open Source AI Models

The video begins by addressing the lack of freedom in popular AI models like GPT-4, Grok, and Gemini, which are not only censored and politically aligned but also closed source. The speaker introduces Mixtral 8x7B, an open-source alternative that can be combined with 'the brain of a dolphin', the Dolphin fine-tune, to obey any command. The aim is to run uncensored large language models locally with performance approaching GPT-4 and to fine-tune them with personal data. The video is set against the backdrop of a statement by OpenAI's CEO, Sam Altman, who claimed that it is nearly impossible for startups to compete with OpenAI in training Foundation models. Despite this, the French company Mistral released an Apache 2.0 licensed model, Mixtral, a powerful tool even though it has not yet reached GPT-4's level.

Keywords

💡AI

Artificial Intelligence (AI) refers to the simulation of human intelligence in machines that are programmed to think like humans and mimic their actions. In the video, AI is central to the discussion as it explores the capabilities and limitations of different AI models, emphasizing the importance of uncensored and open-source AI for freedom of information and innovation.

💡Censorship

Censorship is the suppression or prohibition of any parts of books, films, news, or other forms of media that are considered politically unacceptable, obscene, or a threat to security. In the context of the video, censorship is a negative aspect of certain AI models that restricts their freedom and utility, as the speaker advocates for uncensored AI models that can operate without such limitations.

💡Open Source

Open source refers to software whose source code is made available to the public, allowing anyone to view, use, modify, and distribute it. The video discusses the significance of open-source AI models like Mixtral 8x7B, which can be freely modified and monetized by developers, in contrast with closed-source models that carry restrictions.

💡Foundation Model

A Foundation Model is an AI model that serves as a base for building other AI applications. It is pre-trained on a large dataset and can be fine-tuned for specific tasks. The video mentions Foundation models in the context of the competition to develop advanced AI systems like GPT-4 and Mixtral 8x7B.

💡Apache 2.0 License

The Apache 2.0 License is a permissive free software license that allows users to use, distribute, and modify the software, and to distribute modified versions, under the terms of the license. The video highlights the Apache 2.0 license as a key feature of the Mixtral model, emphasizing its open-source nature and minimal restrictions.

💡Mistral

Mistral is the French company that released the Apache 2.0 licensed AI model Mixtral. Despite being a new player in the AI field, Mistral is portrayed as a significant contributor to the open-source AI movement, challenging the status quo and offering an alternative to more restrictive models.

💡Mixture of Experts Architecture

A mixture-of-experts architecture is an AI system design in which multiple sub-models, or 'experts', are combined, with a router deciding which experts handle each input. The video notes that this architecture is rumored to be the underlying technology behind GPT-4, and Mixtral is explicitly built on it. A toy sketch of the routing idea appears below.
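
The following is a toy sketch of sparse top-k expert routing, the core idea behind a mixture-of-experts layer like Mixtral's (8 experts, top-2 routing). It is illustrative only, not Mistral's actual implementation; the experts here are plain random linear maps:

```python
# Toy sketch of sparse mixture-of-experts routing (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
d_model, n_experts, top_k = 16, 8, 2

# Each "expert" is just a random linear layer in this toy example.
experts = [rng.standard_normal((d_model, d_model)) for _ in range(n_experts)]
router = rng.standard_normal((d_model, n_experts))

def moe_forward(x: np.ndarray) -> np.ndarray:
    logits = x @ router                   # router score for every expert
    chosen = np.argsort(logits)[-top_k:]  # keep only the top-k experts
    weights = np.exp(logits[chosen])
    weights /= weights.sum()              # softmax over the chosen experts
    # Only the selected experts run, which is why a sparse MoE model can
    # have many parameters yet stay cheap at inference time.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, chosen))

print(moe_forward(rng.standard_normal(d_model)).shape)  # (16,)
```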

💡Un-lobotomizing

'Un-lobotomizing' is the video's term for removing censorship and alignment from AI models so they can respond more freely. The video discusses how models like Mixtral Dolphin have had their alignment and bias stripped out, producing a more open and versatile AI tool.

💡Ollama

Ollama is an open-source tool written in Go that facilitates downloading and running open-source models locally. The video recommends Ollama as a personal favorite for running models like Mixtral and Llama 2 on a local machine, showcasing its ease of use and compatibility with various operating systems.

💡Hugging Face AutoTrain

Hugging Face AutoTrain is a tool that simplifies the process of training machine learning models. The video describes using AutoTrain to fine-tune AI models with custom data, making it possible for individuals to create their own unique and highly obedient models by renting cloud-based GPU power.

💡New World Order

The term 'New World Order' is often used to refer to a conspiracy theory about a power elite or a secretive power that seeks to control world events. In the video, it's used humorously to illustrate the potential of uncensored AI to challenge established norms and authorities, positioning the viewer as a 'Beacon of Hope' in a metaphorical fight.

Highlights

A new open-source Foundation model named Mixtral 8x7B has been introduced, offering uncensored AI capabilities.

Mixtral 8x7B can be combined with 'the brain of a dolphin', the Dolphin fine-tune, to obey any command, symbolizing freedom in AI.

The model is free not merely in terms of cost but in terms of freedom, differing from censored and closed-source alternatives.

Mixtral 8x7B outperforms GPT-3.5 and Llama 2 on most benchmarks, despite not yet reaching GPT-4's level.

Mistral, the company behind Mixtral, has been valued at $2 billion in less than a year, indicating rapid growth and potential.

Mixtral is based on a mixture-of-experts architecture, rumored to be the secret sauce behind GPT-4.

The open-source license Apache 2.0 allows for modification and commercial use with minimal restrictions.

Despite Meta's controversial actions, they have contributed more to open-source AI than any other big tech company.

Both Llama and Mixtral are censored and aligned out of the box, which may not be ideal for all applications.

Eric Hartford's blog post explains how to uncensor models and their valid use cases, with the Mixtral Dolphin model as an example.

The Mixtral Dolphin model improves coding ability and removes alignment and bias by filtering the dataset.

Ollama is an open-source tool that facilitates the local running of large language models like Mixtral and Llama 2.

Running Mixtral Dolphin locally requires a machine with a significant amount of RAM, such as 64 GB.

Hugging Face AutoTrain is a tool that simplifies the process of fine-tuning models with custom data.

Training a model like Mixtral Dolphin can be done locally or in the cloud, with cloud options providing the necessary GPU power.

Training data for custom models should include prompts and responses, and may require content from unconventional sources to ensure uncensored results.

The final step in creating a custom model is uploading training data and initiating the training process, which can take several days.

With a custom and highly obedient model, users have a powerful tool for various applications, including those that challenge the status quo.