Linux package installation is moving toward repository-based distribution. I know you think that by moving your packages into repositories you eliminate manual work for your audience, but in practice it has the opposite effect.
You want your audience to add a repository entry and run "yum install" or "apt-get install" to get things done. Well, that's very wrong!
Your audience consists of technical people. They know how to download packages and move them into their environment for installation. You don't have to help them with this part.
In most, if not all, computing environments, servers are prohibited from contacting the internet, so there's no way for them to do what you ask. What do they have to do instead? They have to get the network guy to poke a hole in the firewall and grant temporary internet access, which presents GREAT security risks to their environment.
Not only that, the sysadmin and the network admin are usually two different people. They may even be in different time zones or on different continents. That dependency can cost up to a day of delay just to accomplish something as simple as downloading a file.
If this were the initial setup of a brand-new environment, that would be OK: after the initial exposure, we could close the hole and go on with our lives. But what about deploying a new tool into an existing environment? What about the future, when production is running and we need to upgrade the tool?
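One workaround for the ongoing-upgrade problem is to mirror the vendor repository once onto an internal host and point production servers at that mirror, so no production machine ever needs to reach the internet. A rough sketch, assuming a RHEL/CentOS 7 staging host with yum-utils and createrepo installed; the repo id, paths, and internal hostname here are hypothetical:

```shell
# On a machine that IS allowed to reach the internet (e.g. a DMZ staging host):
# pull down every package in the vendor repo, then build repository metadata.
reposync --repoid=zabbix --download_path=/srv/mirror   # reposync is in yum-utils
createrepo /srv/mirror/zabbix                          # generates repodata/

# Serve /srv/mirror over plain HTTP internally, then on each production
# server drop in a repo file pointing at the mirror instead of the vendor:
#
# /etc/yum.repos.d/internal-zabbix.repo
#   [internal-zabbix]
#   name=Internal Zabbix mirror
#   baseurl=http://mirror.internal.example/zabbix   # hypothetical internal host
#   enabled=1
#   gpgcheck=1
```

After that, "yum install" and "yum update" on production work exactly as the vendor instructions describe, but against the internal mirror.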
Let me show you a few installation instructions that require internet access:
Zabbix Server (for MySQL)
# yum install http://repo.zabbix.com/zabbix/3.2/rhel/7/x86_64/zabbix-release-3.2-1.el7.noarch.rpm
# yum install zabbix-server-mysql zabbix-web-mysql
Graylog Server
# rpm -Uvh https://packages.graylog2.org/repo/packages/graylog-2.2-repository_latest.rpm
# yum install graylog-server
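The option your audience actually needs is the obvious one: fetch the packages, with their dependencies, on any machine that does have internet access, carry the files over, and install locally. A sketch for the Zabbix case above, assuming RHEL/CentOS 7 with yum-utils installed; the destination directory is hypothetical:

```shell
# On any internet-connected machine (install the zabbix-release RPM first
# so yum knows about the repo), download the packages plus all dependencies
# into one directory instead of installing them:
yumdownloader --resolve --destdir=/tmp/zabbix-pkgs \
    zabbix-server-mysql zabbix-web-mysql        # yumdownloader is in yum-utils

# Move /tmp/zabbix-pkgs to the isolated server (scp, USB, whatever works),
# then install from the local files -- no repository or internet needed:
yum localinstall /tmp/zabbix-pkgs/*.rpm
```

That is the whole "other option": two commands on a connected machine, one on the server.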
Guys, why does it take so many people in your organizations to make such illogical decisions? Not only did you choose the impossible method, you didn't even document the other option, which your audience could follow with ease.