
How to deploy python extensions (shared libraries) with multiple architectures?

Context: one can compile C code such that it can be used as a Python module. The compiled object is a shared library with a specific naming, so Python can find and load it as a module.

Great. I have successfully compiled and tested code such that file "foo.c" becomes a shared library "foo.so", and Python code import foo works.

The goal is to distribute a set of shared libraries for Mac, Linux, and Windows, where import foo loads the appropriate shared library.

Conceptually, I want my distribution to contain a directory with three files:

mypkg/
  ┠─ __init__.py
  ┠─ foo.so    (linux)
  ┠─ foo.dylib (mac)
  ┖─ foo.dll   (windows)

so that from mypkg import foo picks the appropriate library. I do not want to distribute the source code foo.c.

The problem is, Mac will pick the .so file and complain:

ImportError: dlopen(/.../mypkg/foo.so, 0x0002): tried: '/.../mypkg/foo.so' (not a mach-o file)

Is there a pattern / naming scheme which would permit this (short of writing a custom module loader)?
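One wrinkle worth knowing up front: CPython's import system has a fixed list of extension-module suffixes per platform, and on macOS that list still ends in `.so` (the importer never looks for `.dylib`). A quick sketch to inspect the list on any interpreter:

```python
import importlib.machinery

# Suffixes the import system will try when resolving "import foo" to a
# compiled extension on this platform, in priority order.
# Linux/macOS entries end in '.so'; Windows entries end in '.pyd'.
print(importlib.machinery.EXTENSION_SUFFIXES)
```

This is why the three-files-in-one-directory layout can't work as drawn: two of the platforms would both claim `foo.so`.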

Edit: an explanation of why PyPI / pip / wheel-type distribution is not desired: these plugins aren't running in a standard python.exe process.

The main executable is a C program, which supports C-language plugins via an SDK. I've written one such C plugin that embeds Python and exposes a Python interface to the original C API (calling Py_Initialize() etc.). This C plugin looks for, loads, and executes Python plugins, so users can now write Python plugins instead of C plugins. Users place Python plugins in a specific directory, and each is read and executed. (Plugins cannot execute standalone.) That all works fine.

Now, I'm looking at how one of these plugins can define and use a shared library Python module.

main.c -> 1) InitPython
          2) PyImport_Import("plugins/a.py")
          3) PyImport_Import("plugins/b.py")
               -> import mypkg.foo
             ...

If mypkg/foo.py is pure Python, this works great. If foo is a shared library, then it must be named foo.so on Linux and macOS, so I cannot simply ship my plugin as b.py + mypkg/*. I might be able to use pip install --target=plugins foo.whl.

Alternatively I'm testing a different loading mechanism, similar to @shadowtalker's non-recommendation, for mypkg/foo.py:

import os
import platform
import sys

from importlib.machinery import ExtensionFileLoader
from importlib.util import spec_from_file_location, module_from_spec

_system = platform.system()

# e.g. .../mypkg/foo.linux.so, foo.darwin.so, foo.windows.so
filename = os.path.join(os.path.dirname(__file__), f'foo.{_system.lower()}.so')

_loader = ExtensionFileLoader('foo', filename)
_spec = spec_from_file_location('foo', filename, loader=_loader)
_mod = module_from_spec(_spec)
sys.modules['foo'] = _mod   # register so the import below can resolve it
_loader.exec_module(_mod)
from foo import *

It looks like it's working, but I'll continue to test.
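The `foo.<system>.so` naming scheme above is this question's own convention, not anything Python recognizes; a small helper makes the mapping explicit and testable (the `stem`/`system` parameters are hypothetical names for illustration):

```python
import platform

def platform_library_name(stem='foo', system=None):
    """Return the per-platform filename under the assumed
    foo.<system>.so convention, e.g. foo.linux.so, foo.darwin.so,
    foo.windows.so."""
    system = (system or platform.system()).lower()
    return f'{stem}.{system}.so'
```

With this, `platform_library_name('foo', 'Darwin')` yields `'foo.darwin.so'`, which the loader code above can resolve relative to the package directory.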

asked Dec 28 '25 by pbuck

1 Answer

This is handled by the binary package distribution format called Wheel: https://pythonwheels.com/

You will end up building a separate wheel for each platform, and each wheel will contain only the .so/.dylib/.dll files that are needed for that particular platform.

Just about every major library in the Python ecosystem today distributes their code in the wheel format, even those that do not use compiled extensions.

If you are using a PEP-517-compatible build backend (Setuptools, Flit, Hatch, Poetry) building a wheel is as simple as pip install build && python -m build. This will produce a wheel file with a standardized filename, which can be uploaded directly to PyPI for distribution. When end users want to install your package, Pip will download the version of the wheel that is compatible with the user's system.
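The platform-specific part of a wheel's filename is derived from the build platform's tag; a small sketch shows where that tag comes from for the current interpreter:

```python
import sysconfig

# sysconfig.get_platform() returns the platform string for this interpreter,
# e.g. 'linux-x86_64' or 'macosx-11.0-arm64'. In the wheel filename itself,
# dashes and dots are normalized to underscores (e.g. linux_x86_64).
tag = sysconfig.get_platform().replace('-', '_').replace('.', '_')
print(tag)
```

Pip compares this tag (among others) against the tags encoded in each available wheel's filename when choosing which one to download.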

However, this will not handle cross-compiling for multiple platforms. For that, you will need something like a container or VM, or a tool like cibuildwheel.

Note that the details of how to set up your particular build backend to compile an extension, or to include existing compiled artifacts in the wheel, will vary considerably depending on the build backend you are using.

Finally, if you're unfamiliar with the concept of a "PEP 517 build backend", see here for a brief explanation and a tutorial for using Setuptools: https://setuptools.pypa.io/en/latest/build_meta.html

answered Dec 31 '25 by shadowtalker
