Thanks to the great #python course by @piko, I can proudly present my first attempts at #GenArt here, which came out of our first longer project: a mandala generator :yayblob: :blahaj:
License: CC-BY mayz
Using Python 3.11 or higher, I want to create an output file and add a line describing it to a log file as an atomic operation: either the output file is created and the log entry is added, or neither happens. fcntl.flock() is only advisory - will something else give me stronger guarantees, preferably on all three major OSes? #python #file-lock #atomic-operation #question
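Not a full answer to the two-file transaction being asked about, but a common building block worth noting: on both POSIX and Windows (Python 3.3+), os.replace() is an atomic single-file rename. A hedged sketch of the stage-then-publish pattern (function name and structure are illustrative, not from any post above):

```python
import os
import tempfile

def atomic_write(path: str, data: str) -> None:
    """Write data to path atomically: readers see the old contents or the
    new contents, never a partial write."""
    dirname = os.path.dirname(os.path.abspath(path))
    # Stage the temp file in the same directory so the final rename
    # stays on one filesystem (cross-device renames are not atomic).
    fd, tmp = tempfile.mkstemp(dir=dirname)
    try:
        with os.fdopen(fd, "w") as f:
            f.write(data)
            f.flush()
            os.fsync(f.fileno())  # push bytes to disk before publishing
        os.replace(tmp, path)  # atomic rename; overwrites any existing file
    except BaseException:
        os.unlink(tmp)
        raise
```

This only makes each individual file update atomic; coordinating the output file and the log entry as one transaction still needs something on top (e.g. writing the log line first and treating the output file's existence as the commit point).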
I want something like modulefinder that I can point at a target and that outputs the subset of deps/requirements it actually uses (so I can tree-shake requirements in a monorepo).
Does such a tool exist? (before I attempt to write one)
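The stdlib's modulefinder does at least enumerate what a script imports, which is the starting point such a tool would build on. A minimal sketch (the helper name is made up for illustration):

```python
from modulefinder import ModuleFinder

def imported_top_level_modules(script_path: str) -> set[str]:
    """Return the top-level module names a script pulls in, as reported
    by the stdlib's bytecode-scanning ModuleFinder."""
    finder = ModuleFinder()
    finder.run_script(script_path)
    # finder.modules maps dotted module names to Module objects;
    # keep only the top-level package name of each.
    return {name.split(".")[0] for name in finder.modules}
```

The hard remaining step is mapping those import names back to PyPI distribution names (import name and package name often differ), which is where an actual requirements tree-shaker would need extra metadata.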
does anyone know of a good #python api wrapper for creating #mastodon #bots? looking to create something similar to reddit's u/cahbot for the fun of it (if it doesn't exist already)
Problems: @pydantic is great for modeling data!! But at the moment it doesn't support array data out of the box. Often array shape and dtype are as important as whether something is an array at all, but there isn't a good way to specify and validate that with the Python type system. Many data formats and standards couple their implementation very tightly with their schema, making them less flexible, less interoperable, and more difficult to maintain than they could be. The existing tools for parameterized array types like nptyping and jaxtyping tie their annotations to a specific array library, rather than allowing array specifications that can be abstract across implementations.
numpydantic is a super small, few-dep, and well-tested package that provides generic array annotations for pydantic models. Specify an array along with its shape and dtype, then use that model with any array library you'd like! Extending support for new array libraries is just subclassing - no PRs or monkeypatching needed. The type has some magic under the hood that uses pydantic validators to give a uniform array interface to things that don't usually behave like arrays - pass a path to a video file, that's an array. Pass a path to an HDF5 file and a nested array within it, that's an array. We take advantage of the rest of pydantic's features too, including generating rich JSON schema and smart array dumping.
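This is not numpydantic's actual API (the class and field names below are invented for illustration), but the core idea - validating shape and dtype through a generic, duck-typed interface rather than binding to one array library - can be sketched in plain Python:

```python
from dataclasses import dataclass

@dataclass
class ArraySpec:
    """Illustrative stand-in for a parameterized array annotation:
    a shape pattern (None acts as a wildcard axis) plus a dtype name."""
    shape: tuple
    dtype: str

    def check(self, arr) -> bool:
        # Duck-typed: anything exposing .shape and .dtype qualifies,
        # whether it is numpy, dask, zarr, or a custom file-backed wrapper.
        if len(arr.shape) != len(self.shape):
            return False
        if any(want is not None and got != want
               for got, want in zip(arr.shape, self.shape)):
            return False
        return str(arr.dtype) == self.dtype
```

Plugging a check like this into pydantic's validator hooks is what lets one annotation work across array backends.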
This is a standalone part of my work with @linkmlarrays and rearchitecting neurobio data formats like NWB to be dead simple to use and extend, integrating with the tools you already use and across the experimental process - specify your data in a simple yaml format, and get back high quality data modeling code that is standards-compliant out of the box and can be used with arbitrary backends. One step towards the wild exuberance of FAIR data that is just as comfortable in the scattered scripts of real experimental work as it is in carefully curated archives and high performance computing clusters. Longer term I'm trying to abstract away data store implementations to bring content-addressed p2p data stores right into the python interpreter, as simply as if something were born in local memory.
I just published version 0.9.0 of logmerger, compatible with Python 3.13. Here is an example merge of a client and server log, plus a PCAP packet capture file showing TCP send/receive traffic. Uses the textual TUI framework for cursor and mouse interaction in a terminal session. https://pypi.org/project/logmerger #python #tcp #networking #textual
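The core merge step (interleaving several already-sorted logs by timestamp) maps onto the stdlib's heapq.merge. A toy sketch of that idea, not logmerger's implementation - the sample entries are invented:

```python
import heapq
from datetime import datetime

def merge_logs(*logs):
    """Interleave several time-sorted logs of (timestamp, line) pairs
    into one stream, preserving each log's internal order."""
    return list(heapq.merge(*logs, key=lambda entry: entry[0]))

client = [(datetime(2024, 5, 1, 12, 0, 0), "client: request sent"),
          (datetime(2024, 5, 1, 12, 0, 2), "client: response received")]
server = [(datetime(2024, 5, 1, 12, 0, 1), "server: request handled")]
```

heapq.merge is lazy and never materializes the inputs, which matters when the logs being merged are large.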
🚨 🚨 🚨 We're approaching the Final Call for Proposals for #PyOhio 2024!!! 🚨 🚨 🚨
This Sunday, Anywhere on Earth (AoE), will be your last chance to submit a talk for our awesome conference!
If you had fun at #PyCon and want to keep hanging out with the #Python community, or have something you want to share with the rest of us, please submit a talk! We love first-time speakers!