Press Ctrl + T on Windows/Linux or ⌘ + T on macOS to open Sphinx.

Sphinx rules allow you to fine-tune behavior within Sphinx AI. Use the settings button (the ⚙️ button underneath Sphinx’s text box) to open global or local rules for Sphinx AI. Local rules only affect notebooks in the same folder, while global rules affect all your notebooks. Local rules take precedence in case of a conflict.
You can express any preferences or configuration options for Sphinx AI in natural language or code.
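For example, a rules file might contain plain-language preferences like these (the specific rules below are illustrative, not built-in options):

```
Always use polars instead of pandas for dataframes.
Prefer plotly for charts, with a white background theme.
Round displayed percentages to one decimal place.
```

Because rules are interpreted by Sphinx AI rather than parsed as a strict config format, you can phrase them however is clearest to you.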
Memories are how Sphinx AI learns how to be a better copilot for you. As you work through the idiosyncrasies of your data, Sphinx AI will store and remember those insights for later use. Future interactions with Sphinx AI will take into account what it has learned and your experience will continuously improve.
You can disable memory generation or usage in Sphinx AI’s settings (⚙️ button -> settings).
Your conversations are saved locally so that you can stop and start a project at any time. If you want to open a new conversation, click the new conversation button at the top left of the Sphinx window. To open an older conversation, click the search button on the top right of the Sphinx AI window and select the target chat.
When you share a notebook with a colleague, Sphinx AI’s thoughts and process are embedded with the file – Sphinx AI can pick up where it left off with your team. You can also delete conversations and contexts – click the search button on the top right of the Sphinx AI window and trash the target chat.
To have Sphinx AI precisely manipulate an individual cell, click the 🪄 Edit with Sphinx button at the bottom of that cell. Add your prompt and provide context about what you want Sphinx to do with the cell.
Sphinx AI can be configured to utilize Model Context Protocol (MCP) servers. This configuration can be accessed from the “Edit MCP Config” option in Sphinx AI options (⚙️ button -> Edit MCP Config). A typical MCP config has the following format:
{
  "mcpServers": {
    "deepwiki": {
      "description": "MCP tool to get information about public git repos",
      "url": "https://mcp.deepwiki.com/mcp"
    },
    "linear": {
      "description": "Linear MCP server for project management",
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://mcp.linear.app/sse"]
    }
  }
}
Each MCP server entry must specify either a command or a url. Command-style servers will run the provided command to locally instantiate an MCP server. They support the following additional arguments:
- args: Array of command-line arguments
- cwd: Optional working directory for the command
- env: Optional environment variables for the command, provided as a JSON object

For url-style servers, url must be a direct URL to the MCP server. Both HTTP/HTTPS and WebSocket connections are supported.
The description parameter is optional, and lets you tell Sphinx more about the MCP server and when to use it.
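Putting those fields together, a command-style server entry using cwd and env might look like the following (the server name, command, path, and variable name here are placeholders, not a real server):

```json
{
  "mcpServers": {
    "my-local-tools": {
      "description": "Hypothetical local MCP server started from a working directory",
      "command": "python",
      "args": ["-m", "my_mcp_server"],
      "cwd": "/path/to/server",
      "env": { "MY_MCP_API_KEY": "your-key-here" }
    }
  }
}
```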
Sphinx AI can automatically convert your notebook, along with all the context it’s built as you work together, into a Streamlit application which you can deploy and share.
To do so, click the 🚀 Streamlit button at the bottom of the agentic modality selector in the Sphinx chat bar. This will let you prompt Sphinx AI on how you want your Streamlit app to look and operate.
Note that you will need Streamlit available in your VSCode Python interpreter (which can, and often does, differ from the Python environment you are using for any given notebook). You can verify which interpreter you are using with >Python: Select Interpreter in the VSCode command palette.
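One quick way to confirm Streamlit is available to a given interpreter is to run a small check with that interpreter (this is a generic Python snippet, not a Sphinx AI feature):

```python
# Run this with the interpreter selected in VSCode to see
# whether Streamlit is importable from that environment.
import importlib.util

spec = importlib.util.find_spec("streamlit")
if spec is None:
    print("Streamlit is NOT installed in this interpreter")
else:
    print("Streamlit is available")
```

If it reports Streamlit as missing, install it into that environment (for example with pip) before deploying your app.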
We’re working on adding more integrations with other visualization and BI tools, and we’d love to hear what would be most useful to you!
If you encounter any issues, or have any suggestions or questions for our team, please get in touch! We’d love to help.