| Infosec Exchange | https://infosec.exchange/@mdfranz |
| Bluesky | https://bsky.app/profile/mdfranz.com |
| GitHub | https://github.com/mdfranz |
| Blog | https://blog.mdfranz.com |
The Senate Voicemail (Cisco) is full. Tried calling three times.
Fortunately, Maryland Senators and Reps are fighting the good fight.
https://www.youtube.com/live/stI7ZIb9FDg?si=Wh9T8Gi8wheTRsqi
RamaLama is an open-source developer tool that simplifies the local serving of AI models from any source and facilitates their use for inference in production, all through the familiar language of ...
If I was somewhat "early" to jump on the DuckDB bandwagon, I've been a relative latecomer to ClickHouse. After spending multiple days with it over the last week (mostly using clickhouse-local, some Python with chDB, plus a few actual servers deployed on VMs and a test of the Altinity Operator on K3s), it is definitely growing on me.

Also, if you are in a multi-SIEM environment, it is frustrating to keep bouncing back and forth between the different query languages used by Elastic, KQL (Microsoft Sentinel), Splunk, and SumoLogic, all of which are different enough in implementation to make your head explode. It is nice to fall back to common SQL for quickly slicing data.

I'm not sure I'll go back to DuckDB. Why? ClickHouse runs both locally and as a distributed cluster, and I only need to learn a single set of commands whether I'm querying or ingesting data into a cluster, locally, or from S3. I also find the command-line client and its line editing far quicker for modifying previous queries. Once you get fluent, the speed of console SQL blows away the web UI of a SIEM or Jupyter notebooks. Maybe the Athena clients got better over the last few years (I doubt it), but this is a much better querying experience. And this morning I was setting up a website for a non-profit and had to use the mysql client, and it was terrible.

The one thing I do miss about DuckDB is that I haven't figured out how to do an automated CREATE TABLE from JSON data with ClickHouse. The only way I've gotten it working is to run a DESCRIBE, manually create the schema (usually with the help of an LLM) for the CREATE, and then INSERT the data if I want to run in memory, but I could be doing something wrong.

It does take some thought to lay out your folder structure on S3 to maximize searching across different data (and log) types. (This screenshot, searching a month+ of compressed TLS logs from Zeek in S3, just scratches the surface of what you can do.)

I know lots of blue teamers love to stand up a local Elasticsearch/OpenSearch cluster on Docker for ingesting and analyzing data, but I just won't go there. I'm too impatient for that, and would rather keep the RAM for all my Chrome tabs or LLMs.

#SIEM #DuckDB #ClickHouse #Analytics #SQL
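For what it's worth, ClickHouse's schema inference may cover the automated-CREATE-from-JSON case described above: the file() and s3() table functions can infer column types from JSONEachRow data, and CREATE TABLE ... AS SELECT reuses that inferred schema. A minimal clickhouse-local sketch, with hypothetical file names and bucket paths:

```sql
-- Inspect the schema ClickHouse infers from a local JSON-lines file
DESCRIBE file('events.json', JSONEachRow);

-- Create and populate an in-memory table directly from the inferred schema,
-- skipping the manual CREATE TABLE step
CREATE TABLE events ENGINE = Memory AS
SELECT * FROM file('events.json', JSONEachRow);

-- The s3() table function works the same way for compressed logs in a bucket
-- (bucket URL is a placeholder)
SELECT count()
FROM s3('https://bucket.s3.amazonaws.com/zeek/ssl/*.log.gz', JSONEachRow);
```

Whether inference picks sensible types (e.g. String vs DateTime) can depend on the data and the ClickHouse version, so DESCRIBE first is still a good habit.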