Client for joinly: Make your meetings accessible to AI Agents
Project description
joinly-client: Client for a conversational meeting agent, used together with the joinly server
Prerequisites
Set LLM API key
Export a valid API key for the LLM provider you want to use, e.g. OpenAI:
export OPENAI_API_KEY="sk-..."
Or, create a .env file in the current directory with the following content:
OPENAI_API_KEY="sk-..."
For other providers, export the corresponding environment variable(s) and set the provider and model on the command line:
uvx joinly-client --llm-provider <provider> --llm-model <model> <MeetingUrl>
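If you prefer to check your setup programmatically, the following is a minimal sketch, assuming python-dotenv is installed and that OPENAI_API_KEY is the variable your chosen provider requires (adjust the name for other providers):

import os

from dotenv import load_dotenv

# Load variables from a .env file in the current directory, if present.
load_dotenv()

# Fail early if the key for the chosen provider is missing.
if not os.environ.get("OPENAI_API_KEY"):
    raise RuntimeError("OPENAI_API_KEY is not set; export it or add it to a .env file")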
Start joinly server
Make sure you have a running joinly server. You can start it with:
docker run -p 8000:8000 ghcr.io/joinly-ai/joinly:latest
For more details on joinly, see the GitHub repository: joinly-ai/joinly.
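To confirm the container is up before connecting, a quick reachability check is enough. This is a minimal sketch, assuming the default port mapping from the command above (localhost:8000); it only verifies that something is listening, not that the MCP endpoint itself responds:

import socket

# Try to open a TCP connection to the mapped joinly port.
try:
    with socket.create_connection(("localhost", 8000), timeout=3):
        print("joinly server port is reachable")
except OSError as exc:
    print(f"joinly server not reachable on localhost:8000: {exc}")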
Command line usage
We recommend using uv to run the client; you can install it using the command in its repository.
Connect to a running joinly server and join a meeting, here loading environment variables from a .env file:
uvx joinly-client --joinly-url http://localhost:8000/mcp/ --env-file .env <MeetingUrl>
Add other MCP servers using a configuration file:
{
  "mcpServers": {
    "localServer": {
      "command": "npx",
      "args": ["-y", "[email protected]"]
    },
    "remoteServer": {
      "url": "http://mcp.example.com",
      "auth": "oauth"
    }
  }
}
uvx joinly-client --mcp-config config.json <MeetingUrl>
You can also set other session-specific settings for the joinly server, e.g.:
uvx joinly-client --tts elevenlabs --tts-arg voice_id=EXAVITQu4vr4xnSDxMa6 --lang de <MeetingUrl>
For a full list of command line options, run:
uvx joinly-client --help
Code usage
Direct use of the run function:
import asyncio

from dotenv import load_dotenv
from joinly_client import run

load_dotenv()


async def async_run():
    await run(
        joinly_url="http://localhost:8000/mcp/",
        meeting_url="<MeetingUrl>",
        llm_provider="openai",
        llm_model="gpt-4o-mini",
        prompt="You are joinly, a...",
        name="joinly",
        name_trigger=False,
        mcp_config=None,  # MCP servers configuration (dict)
        settings=None,  # settings propagated to joinly server (dict)
    )


if __name__ == "__main__":
    asyncio.run(async_run())
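The mcp_config and settings parameters accept plain dicts. Below is a hedged sketch of how the configuration file and CLI flags from above might translate; it assumes the dict mirrors the mcpServers file format one-to-one, that parameters not shown keep their defaults, and that the settings keys correspond to the CLI flags (e.g. --lang), which is not confirmed here:

import asyncio

from dotenv import load_dotenv
from joinly_client import run

load_dotenv()

# Assumed to follow the same shape as the configuration file example above.
MCP_CONFIG = {
    "mcpServers": {
        "remoteServer": {
            "url": "http://mcp.example.com",
            "auth": "oauth",
        },
    },
}


async def async_run():
    await run(
        joinly_url="http://localhost:8000/mcp/",
        meeting_url="<MeetingUrl>",
        llm_provider="openai",
        llm_model="gpt-4o-mini",
        mcp_config=MCP_CONFIG,
        # Hypothetical keys mirroring the CLI flags shown earlier; check --help or the
        # joinly server documentation for the settings actually supported.
        settings={"lang": "de"},
    )


if __name__ == "__main__":
    asyncio.run(async_run())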
Or use only the client together with a custom agent:
import asyncio

from joinly_client import JoinlyClient
from joinly_client.types import TranscriptSegment


async def run():
    client = JoinlyClient(
        url="http://localhost:8000/mcp/",
        name="joinly",
        name_trigger=False,
        settings=None,
    )

    async def on_utterance(segments: list[TranscriptSegment]) -> None:
        for segment in segments:
            print(f"Received utterance: {segment.text}")
            if "marco" in segment.text.lower():
                await client.speak_text("Polo!")

    client.add_utterance_callback(on_utterance)

    async with client:
        # optionally, load all tools from the server
        # can be used to give all tools to the llm
        # e.g., for langchain mcp adapter, use the client.session
        tool_list = await client.list_tools()

        await client.join_meeting("<MeetingUrl>")
        try:
            await asyncio.Event().wait()  # wait until cancelled
        finally:
            print(await client.get_transcript())  # print the final transcript


if __name__ == "__main__":
    asyncio.run(run())
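Callbacks can also be used for side effects such as persisting the transcript as it arrives. The following is a minimal sketch built on the same client API, assuming the constructor parameters not shown keep their defaults; the log file name is arbitrary:

import asyncio
import json

from joinly_client import JoinlyClient
from joinly_client.types import TranscriptSegment


async def main():
    client = JoinlyClient(url="http://localhost:8000/mcp/", name="joinly")

    async def log_utterance(segments: list[TranscriptSegment]) -> None:
        # Append each received segment as one JSON line.
        with open("transcript.jsonl", "a", encoding="utf-8") as f:
            for segment in segments:
                f.write(json.dumps({"text": segment.text}) + "\n")

    client.add_utterance_callback(log_utterance)

    async with client:
        await client.join_meeting("<MeetingUrl>")
        await asyncio.Event().wait()  # run until cancelled


if __name__ == "__main__":
    asyncio.run(main())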
File details
Details for the file joinly_client-0.1.18.tar.gz.
File metadata
- Download URL: joinly_client-0.1.18.tar.gz
- Upload date:
- Size: 18.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: uv/0.8.22
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 4b3edb2818abcba7c275e3e94f8f721e51611d0ad8e3ea98af4a4a50e66d7193 |
| MD5 | 361cc3d50c0b86f1099e1aa92bd618ed |
| BLAKE2b-256 | 44e3ddcd838197c3be2be273e9c7215ab72cf13936e6543db242b05183e0465a |
File details
Details for the file joinly_client-0.1.18-py3-none-any.whl.
File metadata
- Download URL: joinly_client-0.1.18-py3-none-any.whl
- Upload date:
- Size: 20.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: uv/0.8.22
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 5c9ea24659d8b835b829e2e21b11ee4d55295b9c74481a8dc2be34c9b6886a75 |
| MD5 | c0934b0e04c73ec8861a10ec8552a57d |
| BLAKE2b-256 | 76269d799223d4bf7cf0b906d2a744f7472271d8d14e3fe20573521748a5196c |