Cmake big redesign #349

Merged
merged 38 commits into machinekit:master from the cmake branch on Dec 27, 2021

Conversation

@cerna (Contributor) commented Mar 28, 2021

This is a preview pull request intended for use as a discussion piece in (not only) the Machinekit Matrix Room.

@cerna cerna marked this pull request as draft March 28, 2021 21:42
@cerna cerna force-pushed the cmake branch 3 times, most recently from 838bcf7 to 1506a4e on April 12, 2021 19:18
@cerna cerna force-pushed the cmake branch 3 times, most recently from 62809dd to 03194ea on May 26, 2021 14:24
@cerna (Contributor, Author) commented May 27, 2021

A few how-tos for anyone who wants to try this out:

Generally speaking, the dependency requirements are almost the same, so the standard mk-build-deps -irs sudo run against the control file will get you most of the way. After that you will need the standard Debian build tools as stipulated in the Building Tutorial, CMake in version at least 3.20 (a hard requirement), which can be had from the Snap store or, for those who hate Snaps, from the official Kitware website, and Ninja (Make will suffice, but Ninja is supported too). Then you will need Direnv - if you want, you can work around it - and the build tooling for the Python modules.

After cloning the pull request, create a new folder machinekit_hal_build next to the one where you have cloned Machinekit-HAL and run something like:

cd machinekit_hal_build
# Configure with the Ninja Multi-Config generator; Python parts are disabled for now
cmake ../machinekit_hal -G"Ninja Multi-Config" -DBUILD_PYTHON_DISTRIBUTIONS=0
# Build the Debug configuration
ninja -f build-Debug.ninja
# Let Direnv set up the runtime environment (or: source ./Debug/.envrc)
direnv allow ./Debug
cd ./Debug
halrun

and you should be able to run Machinekit-HAL (so far without the comp and icomp modules, no HAL drivers and no Python [needs more work]).

@cerna (Contributor, Author) commented May 31, 2021

What would be the best way to structure complex managed HAL modules like the hostmot2 one or even hal_pru_generic? What I have been trying to do is to create - of course where possible and where it makes sense - one target (in CMake terminology) per directory. (For example, the component timedeltav2 comprises just two files in the source tree: timedeltav2.icomp and CMakeLists.txt.) But such components depend only on Machinekit-HAL's core, pretty much nothing else. The Hostmot2 type of modules, on the other hand, depend on each other, i.e. you need to first load one into the loader namespace and then another, kind of like how the hal module depends on the rtapi module. Similarly, with hal_pru_generic you have the managed HAL module and then the firmware .bin files.

So one way to do it is to create a hostmot2 mount-point (which is very similar to how it was originally), like in the current state; however, the mesa_7i65 and mesa_uart modules, which originally were not in the hostmot2 umbrella folder, would then probably need to be put there too.

The other way is to pull everything, without branching, into the driver directory. The same goes for hal_pru_generic and its module and firmware.
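
For illustration, here is what one such per-directory target could look like in plain CMake; the target name, the preprocessed C source and the machinekit_hal_runtime dependency are assumptions for the sketch, not the exact conventions of this pull request:

include(GNUInstallDirs)

# One target per directory: a HAL managed module built as a MODULE library.
# Assumes the .icomp has already been preprocessed into C by the comp generator.
add_library(timedeltav2 MODULE timedeltav2.c)

# Hypothetical core library target standing in for Machinekit-HAL's runtime
target_link_libraries(timedeltav2 PRIVATE machinekit_hal_runtime)

# HAL modules are loaded by plain name, so drop the 'lib' prefix
set_target_properties(timedeltav2 PROPERTIES PREFIX "")

install(TARGETS timedeltav2
        LIBRARY DESTINATION "${CMAKE_INSTALL_LIBDIR}/machinekit-hal")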

@cerna (Contributor, Author) commented May 31, 2021

@jallwine,

looking at the hal_pru_generic HAL module, is there any simple (or simple-ish) way to build the rsct.bin file used during the linking stage ad hoc from a human-readable text file? (So as not to distribute a binary blob with the repository.)

Also, there is this part of the Submakefile:

install_prubin:
	(...)
#	#  Copy .bin and .dbg symlinks to the rtlib/modules directory
	$(DIR) $(DESTDIR)$(HAL_RTLIB_DIR)
	cp -a $(RTLIBDIR)/*.bin $(RTLIBDIR)/*.dbg $(DESTDIR)$(HAL_RTLIB_DIR)

In the end this means that the .bin and .dbg files are installed to two different places. Could it be made to work so that they need to be installed in only one place?
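
(For context, on the CMake side a single destination would boil down to one install() rule. This is a generic sketch; the file names and the 'machinekit-hal/firmware' destination are made up for illustration:)

# Generic CMake sketch: ship the generated .bin and .dbg artifacts to one
# directory; configurations would then point the instantiation path there.
include(GNUInstallDirs)

install(FILES
        "${CMAKE_CURRENT_BINARY_DIR}/pru_generic.bin"
        "${CMAKE_CURRENT_BINARY_DIR}/pru_generic.dbg"
        DESTINATION "${CMAKE_INSTALL_LIBDIR}/machinekit-hal/firmware")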

Thank you.

@jallwine (Contributor) commented

I’m sure the rsct.bin file could be generated at build time. I’ll have to dig up my notes on how I generated that, but it’s basically an empty resource table that is only necessary to convert the .bin files that are generated by pasm into an ELF format suitable for use with rproc.

As far as I know the .bin files and .dbg files can be anywhere. The path is specified in the instantiation of hal_pru_generic, so configurations would need to be changed if the path changes.

@cdsteinkuehler may have thoughts as well.

@jallwine (Contributor) commented

The way I generated the rsct.bin file was based on this script: https://github.com/rogerq/pru-eth-firmware/blob/master/update_firmware.sh

I believe I compiled a simple PRU program from the BeagleBone examples that can be executed with rproc and stripped out the resource table. I’m not very familiar with the ELF format, but there has to be a better way to get a basic resource table.

@jallwine (Contributor) commented Jun 2, 2021

Digging a little further, there's a resource_table_empty.h header file that provides the empty resource table: https://git.ti.com/cgit/pru-software-support-package/pru-software-support-package/tree/labs/Getting_Started_Labs/c_code/solution/am572x/resource_table_empty.h

@cerna (Contributor, Author) commented Jun 2, 2021

Digging a little further, there's a resource_table_empty.h header file that provides the empty resource table: https://git.ti.com/cgit/pru-software-support-package/pru-software-support-package/tree/labs/Getting_Started_Labs/c_code/solution/am572x/resource_table_empty.h

Thanks for the link! This looks a bit simpler.

I have done some digging too. There is an example from Klipper - which uses the C interface and not PASM - but maybe it would be possible to bend it to our purpose. (This is a bit outside my area of expertise, so bear with me if I am way off.)

Also, it seems that bits and pieces of the am335x_pru_package are hardcoded into the Machinekit-HAL repository from the times of the Machinekit monorepo. Perusing the code, I so far cannot tell if there were changes on the Machinekit side or if it is just an old version of it, but I think the reasonable action would be to fork that repository into the Machinekit organization, write simple packaging for it (should not be hard) and turn it into a build dependency, thus eliminating the need to build the tool in each build and to carry this support code in the Machinekit-HAL repository. It would also make it easier to include changes from upstream.

It also appears that when cross-compiling, the current build compiles pasm twice, once for the TARGET and once for the HOST architecture. That is, as far as I can tell, also unnecessary.
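
(To make the 'compile pasm only once' idea concrete, a hedged CMake sketch; the target name and source path are assumptions, not what the build currently does:)

# Sketch: provide pasm for the HOST architecture only, even when
# cross-compiling; source path and variable names are illustrative.
if(CMAKE_CROSSCOMPILING)
    # Expect a pre-built native assembler, e.g. from a Debian build dependency
    find_program(PASM_EXECUTABLE pasm REQUIRED)
else()
    add_executable(pasm src/pasm/pasm.c)
    set(PASM_EXECUTABLE "$<TARGET_FILE:pasm>")
endif()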

@jallwine (Contributor) commented Jun 2, 2021

How about something like this? #357

@jallwine (Contributor) commented Jun 2, 2021

As for the am335x_pru_package, I modified some of the code that was copied from that package in order to get support for remoteproc working on the BeagleBone AI (the uio_pruss driver doesn't work on the AI), while still using the uio_pruss driver on the BeagleBone Black. It seems like TI wants everyone to start using remoteproc rather than the uio_pruss driver. I also believe pasm isn't supported by TI any more; they encourage everyone to use gcc or clpru to compile C for the PRU. The PRU side of hal_pru_generic, though, is all implemented in assembly, and porting it to C seems like a lot of work. Instead, I figured out how to convert the compiled output from pasm into an ELF file that is suitable to be started/stopped using remoteproc.

So, all that said, I'm not sure how much benefit there is to splitting that stuff out since it's already been integrated into machinekit-hal and I doubt there will be any valuable upstream updates.

@cerna (Contributor, Author) commented Jun 3, 2021

So, all that said, I'm not sure how much benefit there is to splitting that stuff out since it's already been integrated into machinekit-hal and I doubt there will be any valuable upstream updates.

I am trying to find the best possible way to structure the current code into small, isolated pieces so that users don't have to go through the whole big codebase and everything looks pretty much uniform and approachable.

The idea to put it inside its own package/repository was born from the fact that it is basically part of the toolchain (akin to GCC) and it would be easier to integrate upstream changes. If you say that TI has basically killed PASM, then the second part is passé and the first one is not all that important. (It would need to be built to the side, and the toolchain file would have to include information about both the host and the foreign compiler, but that is exactly what is happening now, so pretty much no logical change.)

So, if you think this is the best solution (and as this endeavour is not about rewriting the code, just about how it is structured and built), then I will take it.
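
(To make the 'both compilers in the toolchain file' point concrete, a minimal sketch; the MACHINEKIT_HOST_C_COMPILER variable is made up, not an existing CMake or Machinekit-HAL name:)

# Sketch of a cross toolchain file carrying the TARGET compiler plus a
# pointer to the native compiler for host-side tools such as pasm.
set(CMAKE_SYSTEM_NAME Linux)
set(CMAKE_SYSTEM_PROCESSOR arm)

set(CMAKE_C_COMPILER arm-linux-gnueabihf-gcc)
set(CMAKE_CXX_COMPILER arm-linux-gnueabihf-g++)

set(MACHINEKIT_HOST_C_COMPILER /usr/bin/gcc CACHE FILEPATH
    "Native compiler used to build host-only tools")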

@jallwine (Contributor) commented Jun 3, 2021

I do think there is tremendous value in breaking code up into more maintainable pieces. It would be nice, for example, to be able to work only on hal_pru_generic without needing to worry about compiling the rest of machinekit-hal. I developed a new HAL component bb_gpio that is an alternative to hal_bb_gpio and haven't bothered to integrate it into machinekit-hal because it's easier to work on it separately. So, I do think there could be value in breaking out pasm and other components, but it's a trade-off between future time savings and how long it would take to split up, and I don't have a good sense of how hard it would be to break it out.

@cerna (Contributor, Author) commented Jun 23, 2021

The current status is that the acceptance tests (most of what is called runtests) are working and running green. The tests that fail are the ones which should not be part of runtests in the first place - those requiring compilation - plus the halmodule test (not yet implemented), the hostmot2 test (not yet implemented) and the halcompile test (which will need rethinking, as one cannot just install arbitrary code into /usr/lib without going through the Debian packaging first; using /usr/local/lib will probably be required).

Basic testing in GitHub Actions is also running, and it found an error on Ubuntu 18.04 Bionic when compiling the nanopb code - that will need further investigation.

All in all, it is starting to look usable. Hopefully this time around, the curse will not strike.

@cerna (Contributor, Author) commented Jun 23, 2021

@jallwine,

I have been trying to look into whether and how I can build a toolchain/compiler which I can then use in subsequent steps. It seems this is a problem in CMake, as the compiler needs to be available before the configuration step, when project() is declared. And CMake doesn't really like changing compilers in the tree during configuration when you have multiple project() declarations in one source tree. (I don't know, maybe I am doing something wrong; this is my first rodeo doing a transformation this big.)

So, from this, I think I am going to need the compiler pre-built before calling the CMake configuration. I don't suppose you have any experience with this?
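
(For reference, the usual CMake answer to 'the compiler must exist before project()' is a superbuild: a thin top-level project builds the tool as an external project first and only then configures the real tree. A sketch, with all paths hypothetical:)

# Superbuild sketch: build the PRU assembler before the main tree is
# configured, so the compiler exists by the time its project() runs.
cmake_minimum_required(VERSION 3.20)
project(machinekit-hal-superbuild NONE)

include(ExternalProject)

ExternalProject_Add(pasm
    SOURCE_DIR "${CMAKE_CURRENT_SOURCE_DIR}/src/pasm"
    CMAKE_ARGS "-DCMAKE_INSTALL_PREFIX=${CMAKE_CURRENT_BINARY_DIR}/tools")

ExternalProject_Add(machinekit-hal
    SOURCE_DIR "${CMAKE_CURRENT_SOURCE_DIR}/machinekit_hal"
    CMAKE_ARGS "-DPASM_EXECUTABLE=${CMAKE_CURRENT_BINARY_DIR}/tools/bin/pasm"
    INSTALL_COMMAND ""
    DEPENDS pasm)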

@jallwine (Contributor) commented

Sorry, definitely not my area of expertise.

@cerna cerna force-pushed the cmake branch 4 times, most recently from 5c47492 to a8e9a70 on June 25, 2021 02:53
cerna added 14 commits December 22, 2021 17:16
…ged modules

Change the directory structure of complex managed HAL modules.
Resolve problems preventing building Machinekit-HAL for different TARGET architectures via 'gcc' compiler. Use Debian Buster and up for actual testing.
The CMake build-system is able to run the 'runtests' from the BUILD tree. Only the acceptance type of tests is working. All tests which require compilation, and thus probably should not be part of 'runtests' but of some other test suite, are currently not working.

Also not working are the Python 'halmodule' test and the 'hostmot2' test.

At the moment, 8 tests fail.
This commit repairs builds on Ubuntu 18.04 Bionic. It turns out Ubuntu 18.04 has too old a version of the 'protobuf-protoc' compiler, which had to be replaced with a back-ported one from Ubuntu 20.04 Focal.

This also adds work on test running, lowering the failures to 5 (and really 2).
Move the header files from the 'HAL' library (the 'hal_api' INTERFACE library) into the folder 'include/hal', which creates a namespace - for example, the 'hal.h' header should now be included as 'hal/hal.h'.
…y 'runtime'

Namespace all C and C++ libraries under the 'src/libraries' mount-point. Specifically, the 'runtime' one (RTAPI) uses a higher level of abstraction: parts that are a 'runtime' but for one reason or another live in multiple directories namespaced 'runtime_*' are all in one 'runtime' namespace for imports.
Implement CPack based packaging for the 'libmachinekit-hal', 'libmachinekit-hal-dev' and 'modmachinekit-hal-components' on Debian systems via use of 'dh-cmake' debhelper package.
Implementation of the main set of Debian packages for executables, libraries, development files, modules and test-suites for Machinekit-HAL via the CPack.

This does not implement the Python related parts.
Ubuntu 21.04 Hirsute started to return '-flto=auto -ffat-lto-objects' via 'dpkg-buildflags' in CFLAGS and CXXFLAGS and '-flto=auto' in LDFLAGS, thus activating Link-Time Optimization. That started to clash with the linker script which hides the local symbols in HAL managed modules. As a temporary solution, this commit removes the aforementioned flags from the build.

A new target 'setuid' was added to allow setting the setuid bit for 'rtapi_app', 'pci_write' and 'pci_read' when run from the BUILD tree. Call it with 'sudo make setuid' (when using Makefiles). Also override dh_fixperms to do the same thing when building packages.
The CMake build process of the PRU related components is twofold: building the HAL managed drivers (MODULE libraries) and assembling the firmware code written in the PRU Assembly language. (Represented by 'hal_pru_debug', 'hal_pru' and 'hal_pru_generic' on one side and 'pru_generic', 'pru_generic_bbai', 'pru_generic_direct_io_only', 'pru_generic_bbai_direct_io_only' and 'pru_decamux' on the other.)

This required implementing a new ASM language for the TI PRU Assembler flavour in CMake. (A generic sketch of this mechanism follows after the commit list.) The assembler itself (PASM) is now distributed as a third-party dependency from Machinekit-HAL's Debian repository.
Implementation of the Python packaging workflow using the CPack function and an external Python installer executable.

Implementation of the BeagleBone/Texas Instruments PRU package structure.
Add packaging dependencies for a successful run of 'mk-build-deps' and 'dpkg-buildpackage' on Debian based systems (Debian Buster, Debian Bullseye, Ubuntu Bionic, Ubuntu Focal, Ubuntu Hirsute).

Modify the tests - both 'runtests' and 'pytests' - to run to completion during CI testing on Debian based systems.
Install the CMake config script files to the base operating system under the `/usr/lib/<gnu-triplet>/cmake/Machinekit-HAL` mount-point for the Debian packaging.

Implement missing COMPONENT files for selected targets. (As CMake does not take care of it automatically.)

Make sure the 'usercomp.0' runtest is successful.
Because of a runtime bug, remove the 'oldstable' Debian 10 Buster from supported platforms for Machinekit-HAL.
Modify the README for the CMake based build-system.
Script 'git_watcher.cmake' from the https://github.com/andrew-hardin/cmake-git-version-tracking project makes sure the Git SHA1 of the HEAD commit is always up-to-date in Machinekit-HAL's Runtime Config library.
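
To unpack the 'new ASM language' commit above: teaching CMake a new language means dropping three modules onto CMAKE_MODULE_PATH and calling enable_language(). A generic sketch of the pattern follows; the 'PASM' naming and the '-b' flag are illustrative, not this pull request's exact implementation:

# --- CMakeDeterminePASMCompiler.cmake: locate the assembler ---
find_program(CMAKE_PASM_COMPILER pasm)
mark_as_advanced(CMAKE_PASM_COMPILER)
set(CMAKE_PASM_COMPILER_ENV_VAR "PASM")
configure_file("${CMAKE_CURRENT_LIST_DIR}/CMakePASMCompiler.cmake.in"
               "${CMAKE_PLATFORM_INFO_DIR}/CMakePASMCompiler.cmake")

# --- CMakePASMInformation.cmake: how a source becomes an object ---
set(CMAKE_PASM_COMPILE_OBJECT "<CMAKE_PASM_COMPILER> -b <SOURCE>")
set(CMAKE_PASM_INFORMATION_LOADED 1)

# --- CMakeTestPASMCompiler.cmake: skip the try-compile check ---
set(CMAKE_PASM_COMPILER_WORKS 1)

A project then opts in with something like:

list(APPEND CMAKE_MODULE_PATH "${CMAKE_CURRENT_SOURCE_DIR}/cmake")
enable_language(PASM)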
@cerna cerna marked this pull request as ready for review December 22, 2021 16:59
@cerna (Contributor, Author) commented Dec 22, 2021

Well, it's a lot uglier than it should have been and it took a lot longer than I anticipated, but that's life. (I made a promise to myself to finish it before Christmas...)

@the-snowwhite, ready for merge.

CC @zultron as the original author of #200 (if he wants to do some code review).

@cerna cerna changed the title Cmake redesign: Work In Progress Cmake big redesign Dec 22, 2021
New implementation of the public Drone Cloud CI infrastructure for testing Machinekit-HAL's CMake based build-system on ARM based systems.

Removes the 'armhf' architecture, as it was virtualized and not real platform testing.
@cerna (Contributor, Author) commented Dec 26, 2021

I re-implemented testing on Drone Cloud (the free tier for open-source software projects) for the arm64 architecture. (Only arm64 because, as I discovered, Drone has no other hardware and the armhf variant was only virtualized.) So now the build, runtests and pytests run both as Debian packages and as local Make/Ninja flows.

Unfortunately, Drone Cloud now only allows one job to run at any given time. Even with a limited set of OS versions and without re-running parts which already run in GitHub Actions, it still takes around 5 hours. 🤷 (Maybe put all amd64 tests into GitHub Actions and leave only the arm64 ones to Drone Cloud.)

It's better than nothing.

@cerna (Contributor, Author) commented Dec 27, 2021

The Drone Cloud testing successfully ran to completion - after almost 24 hours, I might add. It was plagued by networking errors failing builds on git fetches, Debian package fetches and even simple curl downloads. In other words, nothing pertinent to Machinekit-HAL testing, just housekeeping tasks.

Given this, the fact that I still need to check the rest of the update workflow (namely the Debian repository and the change in names), that it is Christmas and people are enjoying themselves, and that nobody was that interested in this to begin with, I am going to afford myself an exception and merge it.

@cerna cerna merged commit 508efb2 into machinekit:master Dec 27, 2021
@zultron zultron mentioned this pull request Dec 31, 2021
@cerna cerna deleted the cmake branch January 1, 2022 13:14