Learn how to combine OpenCode with AWS MCP servers for flexible, AI-powered architecture assistance without vendor lock-in.
Read the full guide: https://dev.to/gitaroktato/using-your-own-architecture-agent-with-opencode-and-aws-mcp-servers-2j26
I approve of @nlnet holding their ground on requiring the logging of LLM chats:
https://nlnet.nl/foundation/policies/generativeAI/archive/feedback/
"We are open to feedback and we also have to learn ourselves. This is a fast-changing field with many moving parts."
I have feedback!
1) Thanks for having a spine.
2) OpenCode has a convenient feature for exporting chat logs in a nice, git-friendly Markdown format. Awesome, I say! See the screenshot; the feature is called "Export session transcript" (shortcut: ctrl+x x). Avail yourselves of it, my dudes!