
Melody Agents

When crewAI meets Suno AI

Introduction

Ever wondered how a Heavy Metal song about Elon Musk's latest tweet would sound? Well, then this is the repo for you...

Description

This repository contains a crewAI application that automatically generates songs given a topic and a music genre. Song generation is done through the "unofficial" Suno AI API.

The crew consists of three agents:

1️⃣ Web Researcher Agent

The Web Researcher Agent searches the web for relevant information about the provided topic.

2️⃣ Lyrics Creator Agent

The Lyrics Creator Agent takes the previous research and generates high-quality lyrics from it, adapting them to the provided music genre.

3️⃣ Song Generator Agent

The Song Generator Agent takes the generated lyrics and uses the custom Suno tool to interact with the Suno API and download two candidate songs.
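The hand-off between the three agents is strictly sequential: the research feeds the lyrics, and the lyrics feed the song generation. A minimal Python sketch of that flow (the three callables are hypothetical stand-ins; the actual project implements them as crewAI agents and tasks):

```python
def run_melody_crew(topic, genre, research, write_lyrics, generate_songs):
    """Sequential pipeline mirroring the three agents.

    `research`, `write_lyrics` and `generate_songs` are hypothetical
    stand-ins for the Web Researcher, Lyrics Creator and Song Generator
    agents, injected here so the flow is easy to see and test.
    """
    notes = research(topic)              # Web Researcher Agent
    lyrics = write_lyrics(notes, genre)  # Lyrics Creator Agent
    return generate_songs(lyrics)        # Song Generator Agent -> candidate songs
```

In the real application, crewAI's sequential process performs the same chaining: each task's output becomes the context of the next task.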

Getting Started

The first thing you'll need to do is create a .env file and provide values for the following variables:

SUNO_COOKIE: You'll need this to use the Suno API; please check the instructions in the unofficial Suno API repo.

GROQ_API_KEY: Your Groq API key. Why Groq? Because it's free!

SERPER_API_KEY: Your Serper API key.

Check the .env.example file if you have doubts about the .env structure.
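As a rough guide, the .env file would look something like this (the placeholder values are yours to fill in):

```env
SUNO_COOKIE=<your-suno-session-cookie>
GROQ_API_KEY=<your-groq-api-key>
SERPER_API_KEY=<your-serper-api-key>
```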

Now that you have all the environment variables configured, it's time to let the Docker magic begin! This project is set up with docker-compose and creates two containers: one for the Suno API logic and another for the Streamlit + crewAI logic. You can check the docker-compose specification here.
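To visualize that two-container split, a minimal docker-compose layout could look like the following (service names, build paths and the port mapping are illustrative assumptions; the repo's own docker-compose.yml is authoritative):

```yaml
services:
  suno-api:
    build: ./suno-api     # unofficial Suno AI API (assumed path)
    env_file: .env        # needs SUNO_COOKIE
  app:
    build: .              # Streamlit + crewAI logic (assumed path)
    env_file: .env        # needs GROQ_API_KEY and SERPER_API_KEY
    ports:
      - "8501:8501"       # Streamlit's default port
    depends_on:
      - suno-api
```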

To start the application, simply run:

docker compose build && docker compose up

A Streamlit application will be launched on your localhost, on port 8501.


If you click the URL, you should see the Streamlit application's interface.

melody_agents's People

Contributors

michaelistrofficus


melody_agents's Issues

token limit issues: groq 70b running into rate limit issues even when using your example prompt

AI Thought Bubble - Next Action:

Thought: I need to read more content from other relevant websites and articles to gather more information about the topic.

Action: Read website content

Action Input: {"website_url": "https://www.forbes.com/sites/roberthart/2024/05/28/elon-musk-is-feuding-with-ai-godfather-yann-lecun-again-heres-why/"}

RateLimitError: Error code: 429 - {'error': {'message': 'Rate limit reached for model llama3-70b-8192 in organization org_01hrx3emwtett8bq1cyh7w230q on tokens per minute (TPM): Limit 6000, Used 0, Requested 6194. Please try again in 1.94s. Visit https://console.groq.com/docs/rate-limits for more information.', 'type': 'tokens', 'code': 'rate_limit_exceeded'}}
Traceback:
File "/usr/local/lib/python3.11/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 600, in _run_script
    exec(code, module.__dict__)
File "/app/app.py", line 52, in <module>
    result = melody_crew.run()
File "/app/crew.py", line 59, in run
    return crew.kickoff()
File "/usr/local/lib/python3.11/site-packages/crewai/crew.py", line 252, in kickoff
    result = self._run_sequential_process()
File "/usr/local/lib/python3.11/site-packages/crewai/crew.py", line 293, in _run_sequential_process
    output = task.execute(context=task_output)
File "/usr/local/lib/python3.11/site-packages/crewai/task.py", line 173, in execute
    result = self._execute(
File "/usr/local/lib/python3.11/site-packages/crewai/task.py", line 182, in _execute
    result = agent.execute_task(
File "/usr/local/lib/python3.11/site-packages/crewai/agent.py", line 221, in execute_task
    result = self.agent_executor.invoke(
File "/usr/local/lib/python3.11/site-packages/langchain/chains/base.py", line 163, in invoke
    raise e
File "/usr/local/lib/python3.11/site-packages/langchain/chains/base.py", line 153, in invoke
    self._call(inputs, run_manager=run_manager)
File "/usr/local/lib/python3.11/site-packages/crewai/agents/executor.py", line 124, in _call
    next_step_output = self._take_next_step(
File "/usr/local/lib/python3.11/site-packages/langchain/agents/agent.py", line 1138, in _take_next_step
File "/usr/local/lib/python3.11/site-packages/langchain/agents/agent.py", line 1138, in <listcomp>
File "/usr/local/lib/python3.11/site-packages/crewai/agents/executor.py", line 186, in _iter_next_step
    output = self.agent.plan(
File "/usr/local/lib/python3.11/site-packages/langchain/agents/agent.py", line 397, in plan
    for chunk in self.runnable.stream(inputs, config={"callbacks": callbacks}):
File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 2875, in stream
    yield from self.transform(iter([input]), config, **kwargs)
File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 2862, in transform
    yield from self._transform_stream_with_config(
File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 1880, in _transform_stream_with_config
    chunk: Output = context.run(next, iterator)  # type: ignore
File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 2826, in _transform
    for output in final_pipeline:
File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 1283, in transform
    for chunk in input:
File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 4728, in transform
    yield from self.bound.transform(
File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 1300, in transform
    yield from self.stream(final, config, **kwargs)
File "/usr/local/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 249, in stream
    raise e
File "/usr/local/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 229, in stream
    for chunk in self._stream(messages, stop=stop, **kwargs):
File "/usr/local/lib/python3.11/site-packages/langchain_groq/chat_models.py", line 321, in _stream
    for chunk in self.client.create(messages=message_dicts, **params):
File "/usr/local/lib/python3.11/site-packages/groq/resources/chat/completions.py", line 289, in create
    return self._post(
File "/usr/local/lib/python3.11/site-packages/groq/_base_client.py", line 1225, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
File "/usr/local/lib/python3.11/site-packages/groq/_base_client.py", line 920, in request
    return self._request(
File "/usr/local/lib/python3.11/site-packages/groq/_base_client.py", line 1003, in _request
    return self._retry_request(
File "/usr/local/lib/python3.11/site-packages/groq/_base_client.py", line 1051, in _retry_request
    return self._request(
File "/usr/local/lib/python3.11/site-packages/groq/_base_client.py", line 1003, in _request
    return self._retry_request(
File "/usr/local/lib/python3.11/site-packages/groq/_base_client.py", line 1051, in _retry_request
    return self._request(
File "/usr/local/lib/python3.11/site-packages/groq/_base_client.py", line 1018, in _request
    raise self._make_status_error_from_response(err.response) from None
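Note that the error reports "Requested 6194" against a 6000 TPM limit, so a single oversized request fails even with zero prior usage; trimming the scraped website content passed to the model is the real fix, but exponential backoff still helps with transient 429s. A minimal sketch (RateLimitError here is a stand-in for groq's exception; crewAI agents also accept a max_rpm parameter to throttle request frequency):

```python
import time

class RateLimitError(Exception):
    """Stand-in for groq's RateLimitError (HTTP 429)."""

def call_with_backoff(fn, max_retries=5, base_delay=2.0, sleep=time.sleep):
    """Call fn, retrying with exponential backoff on rate-limit errors."""
    for attempt in range(max_retries):
        try:
            return fn()
        except RateLimitError:
            if attempt == max_retries - 1:
                raise                         # out of retries, re-raise
            sleep(base_delay * 2 ** attempt)  # wait 2s, 4s, 8s, ...
```

The `sleep` parameter is injectable only to make the sketch testable; in practice the default `time.sleep` is what you want.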
