Since I could not find a quickstart for running opengrep with the full set of rules from their fork, I thought I'd document what I found out.

Setup

Download the opengrep binary from GitHub and make it executable with chmod +x. Clone the rules repo:

```
git clone git@github.com:opengrep/opengrep-rules.git
```

and clean it up to make it usable by opengrep:

```
cd opengrep-rules
rm -rf .git .github .pre-commit-config.yaml elixir apex
find . -type f -not -iname "*.yaml" -delete
```

Ensure opengrep can load the rules with:

```
opengrep_manylinux_x86 validate .
```

The same can be done for custom rules maintained in a separate repository. AFAIU multiple repositories can be specified by repeating the -f option as needed, see below. We are now ready to scan a repo; from the repo root directory run:

```
opengrep_manylinux_x86 scan \
  -f <path_to>/opengrep-rules \
  --error \
  --exclude-rule=VAL some ti...
```
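As an aside, a hypothetical invocation that combines the upstream rules with a separate custom-rules repo by repeating -f might look like the following; <path_to> and my-custom-rules are placeholders, not paths from an actual setup:

```
# Sketch only: repeat -f once per rules repository, run from the repo root as above.
# <path_to> and my-custom-rules are placeholders.
opengrep_manylinux_x86 scan \
  -f <path_to>/opengrep-rules \
  -f <path_to>/my-custom-rules \
  --error
```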
I was recently asked to recover a Mirth instance whose embedded database had grown to fill all available space, so this is just a note-to-self kind of post. Btw: the recovery, depending on db size and disk speed, is going to take a long time.

The problem

A Mirth Connect 1.8 instance was started, then forgotten (well, neglected, actually). The user also forgot to set up pruning, so the messages filled the embedded Derby database until it grew to fill all the available space on the disk. The OS is Linux.

The solution

First of all: free some disk space so that the database can be started in embedded mode from the CLI. You can also copy the whole Mirth install to another server if you cannot free space. Depending on db size you will need a corresponding amount of space: in my case a 5GB db required around 2GB to start, process logs and then store the temp files during shrinking. Then open a shell as the user that Mirth runs as (you're not running it as root, are you?) and cd in...
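The exact commands depend on the install, but as a rough sketch of the kind of shrinking step this leads up to, assuming Derby's ij tool can be run from the Mirth lib directory, the database directory is mirthdb, and the bulk of the data sits in a MESSAGE table under the APP schema (all of these are assumptions about this particular install, not details from the post):

```
# Minimal sketch, not the exact procedure: jar locations, db path and table name are assumptions.
cd /opt/mirth            # assumed Mirth install directory
java -cp lib/derby.jar:lib/derbytools.jar org.apache.derby.tools.ij <<'EOF'
-- connect to the embedded db (Mirth must be stopped, otherwise the db is locked)
connect 'jdbc:derby:mirthdb';
-- compress the big table so Derby returns unused pages to the filesystem;
-- the third argument (1) requests sequential mode, which needs less temp space
CALL SYSCS_UTIL.SYSCS_COMPRESS_TABLE('APP', 'MESSAGE', 1);
disconnect;
exit;
EOF
```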
Thoughtworks just published volume 33 of their Technology Radar. I found some interesting gems in it that I thought were worth re-sharing:

- LiteLLM: I've been playing around with it to share AWS Bedrock models over a local, OpenAI-compatible API and I am impressed with the breadth of features (for example budgeting); a sketch of what that local API looks like from the client side is after this list. The AI ecosystem is vibrant and flourishing.
- Continuous Compliance: so happy to see this mentioned! Personally I would expand the term to include other compliance tools like Vanta, and I am convinced that this kind of automation and software will be essential for organizations to scale while meeting increasing regulatory demands.
- AGENTS.md: as someone who reads Simon Willison's blog, this is no surprise and a welcome confirmation (another file to watch out for: CLAUDE.md).
- Oxide: I wrote this post almost exclusively to mention Oxide 😅, a company I admire. Whenever people ask me about my cloud exit strategy, my answer is: Oxide. Here's why.
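Since the LiteLLM entry above is about exposing Bedrock models over a local, OpenAI-compatible API, here is a minimal sketch of what a client call against such a proxy looks like, assuming LiteLLM is already running on localhost:4000 with a Bedrock model registered under the alias bedrock-claude (port, alias and key are assumptions, not details from my setup):

```
# Sketch: any OpenAI-compatible client works; plain curl shown here.
# Port, model alias and key are assumptions.
curl http://localhost:4000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-local-key" \
  -d '{
        "model": "bedrock-claude",
        "messages": [{"role": "user", "content": "Hello from behind LiteLLM"}]
      }'
```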