• 1 Post
  • 12 Comments
Joined 2 years ago
Cake day: March 1st, 2024

  • project cost = Σ(i=1..n)(likelihood of risk i occurring × cost of risk i), but we aren’t discussing every possible risk. Only the one risk.

    The risk of:

    • the app requiring compiled components in order to work
    • having to be familiar with setup.py. This is referred to as the sewer, which is what is targeted by attackers, e.g. the xz backdoor
    • maintainers who come later needing to be familiar with, and able to maintain, packages that incorporate other languages, e.g. C or Rust
    • possibly neglecting to perform the compile (but let’s ignore this)
    • the compiler running a binary written and maintained by the spy agency Google

    or

    Just not doing that

    The only justification for going with protoc, over other methods, could only come down to data serialization speed. But in that case, wouldn’t a Rust solution be not only as fast, but also much safer?
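The expected-cost formula above is straightforward to sketch in Python. The risk list and all numbers below are made up purely for illustration:

```python
# Expected project risk cost: sum over risks of likelihood * cost.
# These (likelihood, cost) pairs are hypothetical, for illustration only.
risks = [
    (0.10, 5_000.0),   # 10% chance of a 5k cost (e.g. build breakage)
    (0.02, 20_000.0),  # 2% chance of a 20k cost (e.g. supply-chain incident)
]

expected_risk_cost = sum(likelihood * cost for likelihood, cost in risks)
print(expected_risk_cost)  # 900.0
```

The comparison in the comment is then between this expected cost and the cost of "just not doing that" (zero, for this one risk).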



  • A better approach

    That unfortunately isn’t a better approach. The compilation step requires protobuf to be installed by the distro package manager. To my knowledge it’s not available from PyPI.

    An uncompiled protobuf file is essentially worthless; but once it’s compiled, it’s a binary blob.

    I’m not anti-protobuf. Just make the protobuf compiler available without getting a distro package manager involved.

    Otherwise slower alternatives might be more viable.

    strictyaml bundles strictyaml.ruamel, which used to be an external unmaintained C package.

    This reduces strictyaml dependencies to:

    pyproject.toml

    dependencies = [
        "python-dateutil>=2.6.0"
    ]
    

    Just that one. So you can be confident strictyaml will work.

    Can the same be said for protobuf and Google? (Over-invested in AI and probably dying under a huge debt burden, spending tons of money on AI-wash propaganda while not funding Python projects enough. Maintainers leave or burn out while everyone is too busy being head-fcked by the AI washing to notice.)








  • wreck can. It’s venv aware, takes full advantage of hierarchical requirement files, is intuitive, has a minimal learning curve, and is written in Python.

    [[tool.wreck.venvs]]
    venv_base_path = '.venv'
    reqs = [
        'requirements/pip',
        'requirements/pip-tools',
        'requirements/prod',
        'requirements/dev',
        'requirements/manage',
        'requirements/kit',
        'requirements/mypy',
        'requirements/tox',
    ]
    [[tool.wreck.venvs]]
    venv_base_path = '.doc/.venv'
    reqs = [
        'docs/requirements',
        'docs/pip-tools',
    ]
    
    [tool.setuptools.dynamic]
    dependencies = { file = ['requirements/prod.unlock'] }
    optional-dependencies.pip = { file = ['requirements/pip.lock'] }
    optional-dependencies.pip_tools = { file = ['requirements/pip-tools.lock'] }
    optional-dependencies.dev = { file = ['requirements/dev.lock'] }
    optional-dependencies.manage = { file = ['requirements/manage.lock'] }
    optional-dependencies.docs = { file = ['docs/requirements.lock'] }
    

    reqs fix --venv-relpath='.venv'

    reqs fix --venv-relpath='.doc/.venv'

    From the *.in requirements files, these commands produce *.unlock and *.lock files for each venv. Package versions are synced across all requirements files within that venv.
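The syncing idea can be sketched in a few lines. This is a toy, not wreck’s actual implementation, and the file names and helper below are hypothetical:

```python
# Toy sketch (NOT wreck's implementation): force one package to the same
# pin across every requirements file belonging to the same venv.
from pathlib import Path

def sync_pin(req_files, package, version):
    """Rewrite lines for `package` in each file to a single pinned version."""
    for path in req_files:
        out = []
        for line in Path(path).read_text().splitlines():
            # Crude name extraction; real resolvers parse specifiers properly.
            name = line.split("==")[0].split(">=")[0].strip()
            out.append(f"{package}=={version}" if name == package else line)
        Path(path).write_text("\n".join(out) + "\n")
```

wreck does considerably more (venv awareness, hierarchical .in files, .unlock vs .lock outputs), but the core invariant is the same: within one venv, a package appears at exactly one version.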




  • Package DevOps is a skill. Half-assing it will not:

    • impress a published package author’s peers

    • encourage confidence in the author’s perseverance to maintain the package

    • encourage confidence in the author’s willingness to politely respond to Issues and PRs

    • avoid issues that never get resolved because the author’s packaging skill set is woefully lacking, including but not limited to: Makefile, pre-commit, tox, gha, doctest, documentation, test coverage, multiple platform support, manylinux support, setup for collaboration, and static typing.

    So I suggest redirecting efforts toward studying how to improve packaging rather than how to avoid packaging.

    A suggestion to improve documentation:

    Move the doctests out of in-code documentation and into the test suite! Then the doctests are proof and contribute to coverage.

    python -m coverage run --parallel --data-file=.coverage-combine-tests -m pytest --doctest-glob="*.rst" --showlocals $(verbose_text) tests/safe tests/a tests/b
    

    This is straight from a Makefile, where the file extension for doctests was changed to .rst. You can see there is an A/B testing setup to compare two packages.
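pytest’s `--doctest-glob` collects such .rst files and runs the `>>>` examples inside them. The stdlib doctest module shows the same mechanism on a small scale; the snippet text below is a made-up example:

```python
import doctest

# An .rst-style snippet containing a doctest, like pytest --doctest-glob
# would collect; here the stdlib doctest machinery runs it directly.
rst_text = """
Example
=======

::

    >>> 2 + 2
    4
"""

parser = doctest.DocTestParser()
test = parser.get_doctest(rst_text, {}, "example", "example.rst", 0)
runner = doctest.DocTestRunner()
runner.run(test)
print(runner.failures, runner.tries)  # 0 1
```

Because pytest runs these examples as tests, lines executed by the doctests count toward coverage, which is the point of moving them out of docstrings.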

    Example of how to include a doctest file in a Sphinx document:

    .. literalinclude:: ../tests/a/test_presentation.rst
        :language: pycon
        :name: unique-id
        :caption: illustrative-description