PromptSuggestion

Smart Dislike · MCP Server

Flow 1 · Developer mode

Use PromptSuggestion via FastMCP (cloud)

This path is for anyone whose LLM client supports MCP developer mode. You connect the hosted MCP URL once, then call the tool whenever you want a better follow-up prompt.

Connection steps

Connect the MCP server in your LLM

Step 1 · Turn on developer mode

Open your LLM’s settings → Apps / connectors → Advanced settings → enable developer mode support.

Step 2 · Create a new MCP app

Back in Apps / connectors, click “Create” (top-right). Choose any name and description you like.

In the URL field, paste: https://PromptSuggestion.fastmcp.app/mcp

Step 3 · No auth needed

In the authentication section, choose “No authentication”, tick the confirmation checkbox, and confirm.

Step 4 · Refresh & attach the tool

Refresh your chat page. When you start a new conversation, use the + button → More to attach your new MCP tool.
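If you want to sanity-check the hosted endpoint outside your LLM client, you can send it an MCP `initialize` request by hand. MCP runs JSON-RPC 2.0 over HTTP, so a minimal Python sketch only needs the standard library. The protocol version string and client name below are assumptions, not requirements of this server:

```python
import json
import urllib.request

MCP_URL = "https://PromptSuggestion.fastmcp.app/mcp"


def build_initialize_request(request_id: int = 1) -> dict:
    """Build a JSON-RPC 2.0 `initialize` request per the MCP spec.

    The protocolVersion value is an assumption; use whichever MCP
    revision your client targets.
    """
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            "protocolVersion": "2025-03-26",
            "capabilities": {},
            "clientInfo": {"name": "connection-probe", "version": "0.1"},
        },
    }


def probe(url: str = MCP_URL) -> int:
    """POST the initialize request and return the HTTP status code."""
    body = json.dumps(build_initialize_request()).encode("utf-8")
    req = urllib.request.Request(
        url,
        data=body,
        headers={
            "Content-Type": "application/json",
            # Streamable HTTP servers may answer over SSE, so accept both.
            "Accept": "application/json, text/event-stream",
        },
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.status
```

Calling `probe()` should return a 2xx status if the server is reachable; a connection error or 4xx usually means a typo in the URL.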

JSON config (for MCP-aware clients that use a config file)

Some tools let you declare HTTP MCP servers in a JSON config. Here’s a generic example:

{
  "mcpServers": {
    "prompt-suggestion": {
      "type": "http",
      "url": "https://PromptSuggestion.fastmcp.app/mcp",
      "authentication": "none"
    }
  }
}
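The config file's name and location differ per client, so if you script your setup, it can help to generate the entry programmatically. A small sketch that reproduces the generic fragment above (the field names mirror that example; your client may expect slightly different keys):

```python
import json


def mcp_server_entry(name: str, url: str) -> dict:
    """Return a generic `mcpServers` config fragment for an HTTP MCP server.

    Keys follow the generic example in this guide; check your client's
    docs for its exact schema before writing the file.
    """
    return {
        "mcpServers": {
            name: {
                "type": "http",
                "url": url,
                "authentication": "none",
            }
        }
    }


config = mcp_server_entry(
    "prompt-suggestion", "https://PromptSuggestion.fastmcp.app/mcp"
)
print(json.dumps(config, indent=2))
```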

How to call the tool in chat

Once the MCP server is attached to the conversation, just ask your model something like:

  • “Use the prompt-suggestion tool to give me a better follow-up prompt based on the last answer.”
  • “Call the MCP tool to suggest a sharper prompt using our recent conversation and my feedback.”

The model will call the MCP tool, receive a single suggested prompt, and show it back in chat for you to reuse or edit.
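Under the hood, the client issues a JSON-RPC `tools/call` request on your behalf. A minimal sketch of that payload is below; the tool name and argument names are illustrative assumptions, since the server's real schema comes from its `tools/list` response:

```python
import json


def build_tool_call(tool_name: str, arguments: dict, request_id: int = 2) -> dict:
    """Build a JSON-RPC 2.0 `tools/call` request per the MCP spec."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }


# The tool and argument names here are hypothetical; inspect the tool in
# your client (or call `tools/list`) to see the actual input schema.
request = build_tool_call(
    "prompt-suggestion",
    {
        "conversation": "…recent conversation excerpt…",
        "feedback": "The last answer was too vague.",
    },
)
print(json.dumps(request, indent=2))
```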