missing executable <> package name

phil123456

Hello,

I have been using Linux distros for years now, and if there is one thing I've always found annoying, no matter whether I'm working on a RHEL mainframe or a Raspberry Pi, it's that when compiling software and a dependency is missing, I first have to figure out the missing package name(s) before I can even install it (sometimes taking days, asking in forums and such...)

is there still no solution for this? after all these years? like a global DB and a wrapper around yum/aptitude/you-name-it, so that it proposes or automatically fetches the correct package that contains the executable, or the git repo that needs to be downloaded (far-fetched, I admit it)

most of the time, install issues are about dependencies: just finding the correct missing package name or repo (and then figuring out how to compile it)

hope this speaks to others, because after so many years this is still a time-wasting problem that could be addressed easily, especially with makefiles and such

thanks for the time you'll spend on this one
 


G'day phil, and Welcome to linux.org

The material you are working with/on is way above my pay grade... yet I found your post to be a fascinating read.
I will follow this with great interest.

I will link a few members here who may be able to help

@f33dm3bits

@JasKinasis
 
I know what you are talking about; I have to compile software on research systems every now and then. When it stops to complain about a missing package, I search for it in the repositories or build it from source. You actually got me curious whether there was an easier way. The only things I could find are: apt-get build-dep packagename and yum-builddep packagename. I haven't tried these yet, though, and they probably have their limitations, since they expect the package to be in the enabled repositories; you would run into the same problem when a newer or older package is needed that isn't in the enabled repositories.
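
For reference, the usage would be something like this (the package name is just an example):

# Debian/Ubuntu: needs deb-src lines enabled in /etc/apt/sources.list
sudo apt-get build-dep vim
# RHEL/CentOS: yum-builddep ships in the yum-utils package
sudo yum-builddep vim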

Seems GNU Make has a solution for this, but it seems like overkill for the few times I have to compile software myself. Also, with package managers like yum and apt-get you see where a package is installed from; the problem you would run into when automatically resolving dependencies while compiling is that you still want control over, and visibility into, which sources it is compiling from or resolving the dependencies from. Can those sources be trusted? Is there a way to choose which sources or repositories it will use for dependency resolution, etc.? So I don't think it's quite that simple to accomplish. Although emerge, Gentoo's package manager, seems to be pretty good at automatic dependency resolution, that is to be expected, since it is a source-based distro.
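
On Gentoo that looks roughly like this (package name just as an illustration); emerge shows the whole dependency tree it will build before you confirm:

sudo emerge --ask --verbose media-libs/opencv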
 
yeah, the realm of issues can be quite esoteric, if I may say... sometimes it's

- just about finding the name of the bloody package :)
- a whole recipe of things to accomplish just to make it work and cut to the chase
- about files that should go to /usr/local/lib instead of /usr/lib/whatever
- [insert random use case here ...]

honestly, that would make Linux so much less frustrating, and dependency resolution more user friendly, even for experienced users

if such a solution existed, it would (let's go wild):

1 - be based on a DB of problem<>recipe pairs (matching on the most common error messages)
2 - be available as a friendly web platform that lets users reference errors/recipes/versions/platforms/... with a search engine that gives you suggestions like: install this package, or install and compile from source (with scores showing what people recommend most)
3 - need to be fed by goodwill, by users or package maintainers, of course
4 - allow automation around regular build/install processes, using people's suggestions/scores/votes
5 - propose an interactive mode displaying the recipe that would be executed and asking for confirmation
6 - if a step breaks (how to detect this? regex rules matching the stdout of each executed step?), fall back to interactive mode and ask the operator what to do next (see the sketch below)
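
something like this toy wrapper, just to make point 6 concrete (the rules.txt format and file names are all made up for the example):

#!/bin/bash
# toy build helper: run make, match known error patterns, offer a fix
# rules.txt lines look like: <error regex>|<suggested fix command>
# e.g.: zlib\.h: No such file|sudo apt-get install zlib1g-dev
log=$(mktemp)
make 2>&1 | tee "$log"
if [ "${PIPESTATUS[0]}" -ne 0 ]; then
    while IFS='|' read -r pattern fix; do
        if grep -Eq "$pattern" "$log"; then
            echo "known problem matched: $pattern"
            # read the answer from the terminal, not from rules.txt
            read -r -p "run suggested fix '$fix'? [y/N] " answer < /dev/tty
            [ "$answer" = "y" ] && eval "$fix"
        fi
    done < rules.txt
fi
rm -f "$log"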

I mean, most of the time the resolution is trivial and can certainly be automated in 90% of cases

e.g. zlib.h is missing -> sudo apt-get install libz-dev
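
(and to be fair, part of that lookup already exists; on Debian/Ubuntu apt-file can map a missing file to package names, something like:)

sudo apt-get install apt-file
sudo apt-file update
apt-file search zlib.h      # lists the packages shipping a file named zlib.h
# Fedora/RHEL equivalent:
dnf provides '*/zlib.h'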

a github issue is raised, the moderator sees it's an easy fix and adds a reference to this problem on that platform, which would later remove the need to reply to the same question for other users

another example I just ran into:

[screenshot of the build error]


just fixed it with:

sudo apt-get install autoconf-archive   # provides the missing autoconf macros
./autogen.sh                            # regenerate the configure script
./configure CFLAGS='-g -O0'             # debug build: symbols on, optimizations off

these things can definitely be automated

just thinking out loud :)
 
Members of the forums live in a lot of different time zones. Since most distributions are binary distributions, they are focused on binaries, and people install packages from repositories where the packages have already been compiled. Compiling is an alternative way to install something when there is no other way to get it from a repository. It also takes much more time to keep track of packages that have been manually installed by building from source, and to keep updating them. So the norm for binary distros is to install the binary packages, unless a package is not available in one of the default repositories or one of the trusted third-party repos.
 
the norm? I did not know there was a norm

you always end up with a version of the software that does not have the latest features, and packages are quite infamous for these sorts of problems... you can't just always install binaries... unless you're lucky enough to have everything you need that way
 
They aren't called binary distributions for nothing. That is why a lot of developers tend to prefer Ubuntu over distributions such as CentOS/RHEL, since more recent packages are available. The systems also have to stay manageable for the sysadmins: if you install hundreds of packages compiled from source, the system becomes a pain to manage, update, and migrate (once the OS is EOL).
 
Sometimes you just have no choice: the package is not there, so either you find an alternative PPA or you git clone the repo and roll up your sleeves :)
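
i.e. something along these lines (the PPA and repo names are made up for the example):

sudo add-apt-repository ppa:someuser/someproject   # hypothetical PPA
sudo apt-get update && sudo apt-get install someproject
# or, sleeves rolled up:
git clone https://github.com/someuser/someproject.git
cd someproject && ./configure && make && sudo make install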

Ubuntu is definitely THE system I had problems with ...

I remember spending a whole month trying to compile OpenCV just to have the latest version,

even having to patch some files with the only patch I could find, on an obscure Chinese website...

now of course, it was on an embedded system (an Odroid)

nevertheless the subject is still interesting... especially with all these embedded systems, for which maintainers don't necessarily provide updates... I sometimes have to spend hours, even days, setting everything up before I can even start to work

also I noticed that most RPi/Odroid projects out there are indeed not far-fetched at all... just using a platform, installing ready-made packages, plugging in some USB peripherals, and we end up with a lot of boring embedded projects...

so I think you're right, most people probably prefer the easier path...
 
If you want/need the latest and greatest you might as well use a rolling-release distro or a source-based distro; it makes life a lot easier if you have to compile that much ;)

I sometimes have to compile different R versions, and then I run into that compile-time dependency hell as well, along with some other software used for scientific research that usually isn't in the default repositories, so I know what you are talking about. Also, any configuration-management tool is going to be very hard to use when you are constantly compiling new versions of software instead of using what comes from the default or third-party repositories.
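
In practice that usually looks something like this for me (x.y.z is a placeholder version, and build-dep again needs deb-src entries enabled):

sudo apt-get build-dep r-base            # pull in the compile dependencies of the packaged R
tar xzf R-x.y.z.tar.gz && cd R-x.y.z     # x.y.z = whatever version is needed
./configure --prefix=/opt/R/x.y.z        # keep versions installed side by side
make -j"$(nproc)" && sudo make install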
 
