
Add the Model Context Protocol integration #135058

Open · wants to merge 11 commits into base: dev

Conversation

allenporter (Contributor) commented Jan 8, 2025

Proposed change

Add the Model Context Protocol integration, which is a client.

See https://modelcontextprotocol.io/introduction#general-architecture for background on how MCP works. This integration is the opposite of #134122 and acts as a client.

The initial integration adds support for LLM tools only (it does not yet include prompts or resources, which we don't currently have a general way to support). It registers a new LLM API that exposes the tools returned by the remote MCP server, and that LLM API can then be consumed by another conversation agent integration.
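For context, a rough sketch of the client flow using the mcp Python SDK over the SSE transport (the URL is just a placeholder, and this is not the integration's actual coordinator code):

import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client


async def list_remote_tools(url: str) -> None:
    # sse_client yields a (read, write) stream pair for the session
    async with sse_client(url) as (read_stream, write_stream):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()
            result = await session.list_tools()
            for tool in result.tools:
                # name/description/inputSchema are what get mapped into
                # Home Assistant LLM API tools
                print(tool.name, tool.description, tool.inputSchema)


asyncio.run(list_remote_tools("http://localhost:8000/sse"))  # placeholder URL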

Type of change

  • Dependency upgrade
  • Bugfix (non-breaking change which fixes an issue)
  • New integration (thank you!)
  • New feature (which adds functionality to an existing integration)
  • Deprecation (breaking change to happen in the future)
  • Breaking change (fix/feature causing existing functionality to break)
  • Code quality improvements to existing code or addition of tests

Additional information

Checklist

  • The code change is tested and works locally.
  • Local tests pass. Your PR cannot be merged unless tests pass
  • There is no commented out code in this PR.
  • I have followed the development checklist
  • I have followed the perfect PR recommendations
  • The code has been formatted using Ruff (ruff format homeassistant tests)
  • Tests have been added to verify that the new code works.

If user exposed functionality or configuration variables are added/changed:

If the code communicates with devices, web services, or third-party tools:

  • The manifest file has all fields filled out correctly.
    Updated and included derived files by running: python3 -m script.hassfest.
  • New or updated dependencies have been added to requirements_all.txt.
    Updated by running python3 -m script.gen_requirements_all.
  • For the updated dependencies - a link to the changelog, or at minimum a diff between library versions is added to the PR description.

To help with the load of incoming pull requests:


async def _async_setup(self) -> None:
    """Set up the client connection."""
    self.ctx_mgr = mcp_client(self.config_entry.data[CONF_URL])
allenporter (Contributor, author) commented:
Is there a better way to manage the lifecycle of a contextmanager client library?

I'm essentially having it manage the lifecycle manually here, entering the context when the coordinator starts and exiting it when the coordinator ends.
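For reference, roughly what the manual management looks like (the shutdown hook here is a sketch, not necessarily the exact code in this PR):

async def _async_setup(self) -> None:
    """Set up the client connection."""
    self.ctx_mgr = mcp_client(self.config_entry.data[CONF_URL])
    # Manually enter the async context; this must be paired with a
    # matching __aexit__ when the coordinator is torn down.
    self.session = await self.ctx_mgr.__aenter__()

async def async_shutdown(self) -> None:
    """Close the client connection when the coordinator shuts down."""
    await super().async_shutdown()
    await self.ctx_mgr.__aexit__(None, None, None)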

Member commented:

The only alternative I could think of, which I believe we do for either the Matter server or the Z-Wave server (I forget which):

async def _connect(self, connection_established: asyncio.Event) -> None:
    try:
        async with mcp_client(url) as session:
            self.session = session
            connection_established.set()
            # Hold the connection open; cancelling this task exits the
            # context manager and closes the session.
            await asyncio.Event().wait()
    finally:
        self.session = None

async def _async_setup(self) -> None:
    connection_established = asyncio.Event()
    # async_create_background_task also takes hass and a task name.
    self.config_entry.async_create_background_task(
        self.hass, self._connect(connection_established), "mcp connect"
    )
    async with timeout(10):
        await connection_established.wait()

Member commented:
And I guess you can keep the connection task around to cancel it for other reasons as well.
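Roughly like this (the _connect_task attribute and close method names are just for illustration):

async def _async_setup(self) -> None:
    connection_established = asyncio.Event()
    # Keep a handle on the task so it can be cancelled on unload/reload.
    self._connect_task = self.config_entry.async_create_background_task(
        self.hass, self._connect(connection_established), "mcp connect"
    )
    async with asyncio.timeout(10):
        await connection_established.wait()

async def close(self) -> None:
    """Cancel the connection task, which exits the mcp_client context."""
    self._connect_task.cancel()
    try:
        await self._connect_task
    except asyncio.CancelledError:
        pass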

allenporter mentioned this pull request Jan 9, 2025
Resolved review threads on: homeassistant/components/mcp/manifest.json, homeassistant/components/mcp/coordinator.py, homeassistant/components/mcp/config_flow.py, and tests/components/mcp/test_config_flow.py
home-assistant bot commented Jan 9, 2025

Please take a look at the requested changes, and use the Ready for review button when you are done, thanks 👍

Learn more about our pull request process.

home-assistant bot marked this pull request as draft January 9, 2025 09:15
allenporter marked this pull request as ready for review January 9, 2025 16:41
home-assistant bot requested a review from joostlek January 9, 2025 16:41
"""Initialize the tool."""
self.name = tool.name
self.description = tool.description
self.parameters = convert_to_voluptuous(tool.inputSchema)
Member commented:
When this raises, should we log the input schema that brought it down somewhere? Currently I think it bubbles all the way up to the conversation entity.
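Something like this sketch, maybe (the broad except is just a placeholder since I'm not sure exactly what voluptuous_openapi raises here):

try:
    self.parameters = convert_to_voluptuous(tool.inputSchema)
except Exception:
    # Log the offending schema so a misbehaving MCP server can be debugged
    # without tracing the error back from the conversation entity.
    _LOGGER.debug(
        "Failed to convert input schema for tool %s: %s",
        tool.name,
        tool.inputSchema,
    )
    raise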
