Vaibhav (VB) Srivastav (@reach_vb)
News that Codex has added a convenience feature reporting CI status directly inside the chat. Developers can quickly check the current build and test state without switching tools, making the workflow smoother.
I've upgraded `matrix-goal-bot` with the most important #goalsetting tool - procrastination.
I can now tell the bot to push a goal to "later" and it will exclude it from the default list for two weeks.
This is amazingly handy for tracking details of things I know I'll need to deal with, but cannot focus on yet.
I think this was the key piece that was missing from my previous goal tracking software.
There's still no #ai involved, but I still intend to turn the Cadbury project into an #agenticai with language recognition through a small #LLM, at some point.
How we added an AI assistant to our work chat, and what came of it
We're a small IT company: a SaaS product, 5 developers, 4 managers, a CEO. A standard stack: PHP + Vue, MySQL, GitHub, Telegram for communication. Nothing revolutionary.
https://habr.com/ru/articles/1025690/
#aiassistant #chatops #llm #claude #github #automation #itprocesses #devops
OpAMP server with MCP – aka conversational Fluent Bit control
I’ve written a few times about how OpAMP (Open Agent Management Protocol) may emerge from the OpenTelemetry CNCF project, but, like OTLP (OpenTelemetry Protocol), it applies to just about any observability agent, not just the OTel Collector. As a side project, it gives me a real-world use case to work on my Python skills, as well as an excuse to work with FastMCP (and, shortly, LangGraph). But it is also a chance to bring back the evolved idea of ChatOps (see here and here).
One of the goals of ChatOps was to free us from having to actively log into specific tools to mine for information once metrics, traces, and logs reach the aggregating back ends. We can get part of the way there by leveraging a decent LLM with Model Context Protocol tools through an app such as Claude Desktop or ChatGPT (or their mobile variants). Ideally, though, we want the freedom to work through social collaboration tools rather than being tied to a specific LLM toolkit.
The OpAMP server gives us a UI and the ability to communicate with Fluentd and Fluent Bit without imposing changes on the agent code base (we use a supervisor model): we can issue commands, track what is going on, and optionally require authentication (more improvements in this space to come).
New ChatOps – Phase 1
With the first level of the new ChatOps dynamism being through LLM desktop tooling and MCP, the following screenshots show how we’ve exposed part of our OpAMP server via APIs. As you can see in the screenshot, within our OpAMP server we have the concept of commands. What we have done is take some of the commands described in the OpAMP spec, call them standard commands, and then define a construct for Custom Commands (which can be dynamically added to the server and client).
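The standard-versus-custom command split described above might look something like this in outline (a minimal Python sketch with hypothetical names, not the actual server code):

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class CommandRegistry:
    """Holds OpAMP-spec 'standard' commands plus dynamically added custom ones."""
    standard: dict[str, Callable[..., str]] = field(default_factory=dict)
    custom: dict[str, Callable[..., str]] = field(default_factory=dict)

    def register_custom(self, name: str, handler: Callable[..., str]) -> None:
        # Custom commands can be added at runtime, on server and client alike.
        self.custom[name] = handler

    def dispatch(self, name: str, *args: str) -> str:
        # Standard commands take precedence; custom ones extend the set.
        handler = self.standard.get(name) or self.custom.get(name)
        if handler is None:
            raise KeyError(f"unknown command: {name}")
        return handler(*args)

# Illustrative usage: one built-in command, one added dynamically.
registry = CommandRegistry(standard={"restart": lambda agent: f"restarting {agent}"})
registry.register_custom("rotate-logs", lambda agent: f"rotating logs on {agent}")
```

The appeal of the construct is that the set of commands an LLM can discover and invoke via MCP is not fixed at deployment time.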
The following screenshot illustrates using plain text rather than trying to come up with structured English to get the OpAMP server to shut down a Fluentd node (in this case, as we only had 1 Fluentd node, it worked out which node to stop).
Interesting considerations
What will be interesting to see is the LLM token consumption changes as the portfolio of managed agents changes, given that, to achieve the shutdown, the LLM will have had to obtain all the Fluent Bit & Fluentd instances being managed. If we provide an endpoint to find an agent instance, would the LLM reason to use that rather than trawl all the information?
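To make that trade-off concrete, the difference between a "list everything" endpoint and a targeted lookup might be sketched as follows (hypothetical data shapes, purely illustrative):

```python
# A hypothetical fleet of managed agents, as the OpAMP server might report it.
AGENTS = [
    {"id": "fb-1", "type": "fluent-bit", "host": "node-a"},
    {"id": "fb-2", "type": "fluent-bit", "host": "node-b"},
    {"id": "fd-1", "type": "fluentd", "host": "node-c"},
]

def list_agents() -> list[dict]:
    """Broad endpoint: the LLM must ingest the whole fleet to locate one agent."""
    return AGENTS

def find_agents(agent_type: str) -> list[dict]:
    """Targeted endpoint: returns only matching agents,
    shrinking the context (and tokens) the LLM has to consume."""
    return [a for a in AGENTS if a["type"] == agent_type]
```

Whether the LLM would actually prefer `find_agents` over trawling `list_agents` is exactly the open question: with good tool descriptions it usually should, but that behaviour is worth measuring rather than assuming.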
Next phase
ChatGPT, Claude Desktop, and others already incorporate some level of collaboration capabilities if the users involved are on a suitable premium account (Team/Enterprise). It would be good to enable greater freedom and potentially lower costs by enabling the capability to operate through collaboration platforms such as Teams and Slack. This means the next steps need to look something along the lines of:
#AI #chatops #FluentBit #Fluentd #LangGraph #LLM #MCP #OpAMP #OpenTelemetry #OTel #OTLP
SherlockOps, or how we conquered monitoring
Throughout my whole career as a DevOps engineer, I have always hated monitoring, alerts, and everything connected with them. Not only did I dislike setting it all up, but what I hated most was receiving and resolving alerts. So I always wanted some kind of magic button which, when pressed, would give me the full context of an alert and the ways to resolve it. And, hallelujah, AI arrived.
https://habr.com/ru/articles/1022830/
#DevOps #monitoring #n8n #ai #sherlockops #chatops #automation #mcp #alertmanager
DevOps as a service: how to build support, unification, and adoption of new technologies without chaos
My name is Andrey Tregubov, and I lead the DevOps group at Ви.Tech, the IT subsidiary of ВсеИнструменты.ру. We recently recorded a podcast with Alexander Krylov, product development director of «Штурвал», and discussed where DevOps stops being "the folks who occasionally help the developers" and becomes a service model with a clear intake, priorities, and rules of the game. In this article I have collected the most practical parts of our conversation: when it is time to structure DevOps as a service, why heroics stop working as you grow, how not to drown in support, what to do with a zoo of technologies, and where an Enabling Team, security, and AI fit into the story. You can argue about terminology at length, but in practice it is all very down-to-earth. There is a team, requests come to it, and it does things: automating here, fixing there, pulling everyone out of a fire somewhere else. For a while this even looks fine. The problem is that "fine" and "scales" are far from the same thing.
https://habr.com/ru/articles/1019002/
#DevOps_as_a_Service #Enabling_Team #techradar #deployment_unification #infrastructure_tech_debt #technical_debt #prioritization #ChatOps
While my work life is full of conversations about #ai and #quantumcomputing I am streamlining my weekends with #chatops
Meet Cadbury, a versatile personal assistant for #Matrix.
https://codeberg.org/EdTheDev/matrix-cadbury
Oddly, it is 100% free of AI.
But I will probably add a tiny language model to make the commands more accessible, at some point.
If you use it and enjoy it, let me know, and I might maintain it for longer.
I made my #matrix #bots composable.
https://codeberg.org/EdTheDev/matrix-cadbury
Mojofull (@furoku)
News of the launch of Google Workspace's official CLI. It is reportedly usable both from the CLI and via MCP, and the author predicts that much document and clerical work will shift to a chat-based workflow, explaining that all of the attached tasks will become doable through chat.