feat: Twitter Spaces Integration #1550
Conversation
Hi @slkzgm! Welcome to the ai16z community. Thanks for submitting your first pull request; your efforts are helping us accelerate towards AGI. We'll review it shortly. You are now an ai16z contributor!
please add back the documentation
…, moving twitter spaces plugins to Eliza repo
Some conflicts need review; we should prioritize getting this in since it's a pretty big push.
feat: Twitter Spaces Integration
Risks
Low. Existing users who relied on Deepgram by default will still see no change unless they explicitly define a new `TRANSCRIPTION_PROVIDER`. Fallback logic preserves the original behavior (Deepgram → OpenAI → Local).

Background
What does this PR do?
- Adds a `TRANSCRIPTION_PROVIDER` setting (`deepgram`, `openai`, or `local`) with fallback logic.
- Moves `agent-twitter-client` into this repo for better flexibility and less friction in plugin development.
- Adds Twitter Spaces support with configurable options (e.g. `maxSpeakers`).

Transcription Service Changes
Introduces a `TranscriptionProvider` enum (`Deepgram`, `OpenAI`, or `Local`) to replace string flags.

In `initialize()`, the provider is chosen in this order: `character.settings.transcription` (if the API keys exist), then `.env` (`TRANSCRIPTION_PROVIDER`).
In `character.json`, you can specify:
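For example (a minimal sketch; only the `settings.transcription` path is named in this PR, so the surrounding fields and the exact accepted values are assumptions):

```json
{
  "name": "MyAgent",
  "settings": {
    "transcription": "deepgram"
  }
}
```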
With `DEEPGRAM_API_KEY` set, the service will use Deepgram; otherwise it continues to the next check.
`processQueue()` uses a `switch` on `this.transcriptionProvider` to pick the final method (`transcribeWithDeepgram`, `transcribeWithOpenAI`, or `transcribeLocally`).
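A rough sketch of that dispatch, under stated assumptions (the enum string values, method signatures, and the `audio` parameter are illustrative, not copied from the PR):

```ts
// Names (TranscriptionProvider, transcribeWith*, transcriptionProvider) come
// from this PR; everything else here is an assumed, simplified shape.
enum TranscriptionProvider {
  Deepgram = "deepgram",
  OpenAI = "openai",
  Local = "local",
}

class TranscriptionService {
  private transcriptionProvider: TranscriptionProvider = TranscriptionProvider.Local;

  // Simplified stand-in for the queue worker: processQueue() switches on the
  // resolved provider to pick the final transcription method.
  private async transcribeNext(audio: ArrayBuffer): Promise<string | null> {
    switch (this.transcriptionProvider) {
      case TranscriptionProvider.Deepgram:
        return this.transcribeWithDeepgram(audio);
      case TranscriptionProvider.OpenAI:
        return this.transcribeWithOpenAI(audio);
      default:
        return this.transcribeLocally(audio);
    }
  }

  // Provider-specific implementations (stubbed here).
  private async transcribeWithDeepgram(audio: ArrayBuffer): Promise<string | null> { return null; }
  private async transcribeWithOpenAI(audio: ArrayBuffer): Promise<string | null> { return null; }
  private async transcribeLocally(audio: ArrayBuffer): Promise<string | null> { return null; }
}
```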
Flow Recap
- **Periodic Check**: `shouldLaunchSpace()` (random chance, business hours, cooldown). `manageCurrentSpace()` handles speaker timeouts, occupancy updates, queue acceptance, etc.
- **Space Creation**: builds a `SpaceConfig` (topics from config or GPT), then handles `speakerRequest`, `occupancyUpdate`, `idleTimeout`, etc.
- **Speaker Logic**: an `activeSpeakers` array plus a queue if at capacity (`maxSpeakers`); enforces `speakerMaxDurationMs` per speaker.
- **Stopping**: `stopSpace()` finalizes the Space, logs completion, clears states, etc.
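To make the periodic check concrete, here is a minimal sketch of the `shouldLaunchSpace()` idea; only the three factors (random chance, business hours, cooldown) come from this PR, while the field names, hour window, and cooldown handling are assumptions:

```ts
// Illustrative only: gate Space creation on business hours, a cooldown since
// the last Space, and a random roll against twitterSpaces.randomChance.
function shouldLaunchSpace(
  lastSpaceEndedAt: number, // epoch ms of the previously finished Space (assumed state)
  randomChance: number,     // e.g. twitterSpaces.randomChance
  cooldownMs: number,       // assumed cooldown length
  now: Date = new Date()
): boolean {
  const withinBusinessHours = now.getHours() >= 9 && now.getHours() < 18; // assumed window
  const cooledDown = now.getTime() - lastSpaceEndedAt >= cooldownMs;
  const wonRoll = Math.random() < randomChance;
  return withinBusinessHours && cooledDown && wonRoll;
}
```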
Configuration
A) `.env` / Environment Variables
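For example (a sketch; `TRANSCRIPTION_PROVIDER` and `DEEPGRAM_API_KEY` are named in this PR, while `OPENAI_API_KEY` and the placeholder values are assumptions):

```env
# Optional: deepgram | openai | local. Leave unset to keep the old fallback.
TRANSCRIPTION_PROVIDER=deepgram

# Provider credentials (set only the ones you use)
DEEPGRAM_API_KEY=your-deepgram-key
OPENAI_API_KEY=your-openai-key
```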
B) `character.json` → `"twitterSpaces"` Field
- `maxSpeakers`: number of concurrent speakers allowed.
- `topics`: if none are provided, GPT generates them dynamically.
- `randomChance`: probability for each check cycle to spawn a new Space.
- `speakerMaxDurationMs`: maximum time each speaker can speak before removal.
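Putting those fields together, a character file might look roughly like this (a sketch; the nesting under `settings` and the sample values are assumptions, only the field names above come from this PR):

```json
{
  "settings": {
    "twitterSpaces": {
      "maxSpeakers": 2,
      "topics": ["AI agents", "open source"],
      "randomChance": 0.1,
      "speakerMaxDurationMs": 240000
    }
  }
}
```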
What kind of change is this?
Documentation changes needed?
Yes, minimal. We must mention:
- `TRANSCRIPTION_PROVIDER` in `.env` (optional).
- The `twitterSpaces` config section in `character.json`.

Testing
Where should a reviewer start?
- `transcription.service.ts`, to review how it resolves conflicts by prioritizing character settings, then `.env`, then the old fallback.

Detailed testing steps
- Set `TRANSCRIPTION_PROVIDER` in `.env` to `deepgram` or `openai` (or leave it empty to keep the old fallback).
- Set `twitterSpaces.randomChance` in the character JSON to `1` (for a 100% rate of starting a Space).

No special database migrations are needed. Basic local runs and logs confirm correct functioning.
Future Improvements