
m2crypto-0.28.2-3.el7.x86_64 in salt repository breaks subscription-manager in RHEL7x #49156

Closed
josephname opened this issue Aug 16, 2018 · 32 comments

@josephname

Description of Issue/Question

running "yum update" when salt repository https://repo.saltstack.com/yum/redhat/7Server/x86_64/2018.3/ is attached will break subscription-manager in RHEL7.x servers. The error received is Error: /usr/lib64/python2.7/site-packages/M2Crypto/_m2crypto.so: symbol sk_deep_copy, version libcrypto.so.10 not defined in file libcrypto.so.10 with link time reference

Setup

(Please provide relevant configs and/or SLS files (Be sure to remove sensitive info).)

Steps to Reproduce Issue

(Include debug logs if possible and relevant.)

Versions Report

(Provided by running salt --versions-report. Please also mention any differences in master/minion versions.)
MINION:
Salt Version:
Salt: 2018.3.2

Dependency Versions:
           cffi: 0.8.6
       cherrypy: Not Installed
       dateutil: 1.5
      docker-py: Not Installed
          gitdb: Not Installed
      gitpython: Not Installed
          ioflo: Not Installed
         Jinja2: 2.7.2
        libgit2: Not Installed
        libnacl: Not Installed
       M2Crypto: 0.21.1
           Mako: Not Installed
   msgpack-pure: Not Installed
 msgpack-python: 0.5.6
   mysql-python: Not Installed
      pycparser: 2.14
       pycrypto: 2.6.1
   pycryptodome: Not Installed
         pygit2: Not Installed
         Python: 2.7.5 (default, Oct 11 2015, 17:47:16)
   python-gnupg: Not Installed
         PyYAML: 3.11
          PyZMQ: 15.3.0
           RAET: Not Installed
          smmap: Not Installed
        timelib: Not Installed
        Tornado: 4.2.1
            ZMQ: 4.1.4
 
System Versions:
           dist: redhat 7.2 Maipo
         locale: ascii
        machine: x86_64
        release: 3.10.0-327.el7.x86_64
         system: Linux
        version: Red Hat Enterprise Linux Server 7.2 Maipo

MASTER:
Salt Version:
Salt: 2018.3.0

Dependency Versions:
cffi: Not Installed
cherrypy: Not Installed
dateutil: Not Installed
docker-py: Not Installed
gitdb: Not Installed
gitpython: Not Installed
ioflo: Not Installed
Jinja2: 2.8
libgit2: Not Installed
libnacl: Not Installed
M2Crypto: Not Installed
Mako: Not Installed
msgpack-pure: Not Installed
msgpack-python: 0.4.6
mysql-python: Not Installed
pycparser: Not Installed
pycrypto: 2.6.1
pycryptodome: Not Installed
pygit2: Not Installed
Python: 3.4.8 (default, Mar 23 2018, 10:04:27)
python-gnupg: Not Installed
PyYAML: 3.11
PyZMQ: 15.3.0
RAET: Not Installed
smmap: Not Installed
timelib: Not Installed
Tornado: 4.2.1
ZMQ: 4.1.4

System Versions:
dist: redhat 7.5 Maipo
locale: UTF-8
machine: x86_64
release: 3.10.0-862.2.3.el7.x86_64
system: Linux
version: Red Hat Enterprise Linux Server 7.5 Maipo

@Ch3LL
Contributor

Ch3LL commented Aug 16, 2018

Ping @dmurphy18, can you take a look here?

@dmurphy18
Contributor

@josephname The version of M2Crypto in your versions report is 0.21.1 (typically supplied by Red Hat).

The version of M2Crypto supplied by Salt with 2018.3.2 is m2crypto-0.28.2-3.el7.x86_64.rpm.

Does the issue occur with the version of M2Crypto supplied by Salt installed?

@Ch3LL Ch3LL added the Pending-Discussion The issue or pull request needs more discussion before it can be closed or merged label Aug 16, 2018
@Ch3LL Ch3LL added this to the Blocked milestone Aug 16, 2018
@dmurphy18 dmurphy18 self-assigned this Aug 16, 2018
@josephname
Author

Yes, the 0.28.2-3 version in the salt repository caused the problem.

@dmurphy18
Contributor

@josephname The version from salt-run --versions-report shows 0.21.1 for M2Crypto:

Dependency Versions:
cffi: 0.8.6
cherrypy: Not Installed
dateutil: 1.5
docker-py: Not Installed
gitdb: Not Installed
gitpython: Not Installed
ioflo: Not Installed
Jinja2: 2.7.2
libgit2: Not Installed
libnacl: Not Installed
M2Crypto: 0.21.1
Mako: Not Installed
msgpack-pure: Not Installed

The version of M2Crypto from Salt, 0.28.2-3, is not installed.

@josephname
Author

Yes, we had to fix that server. Sorry for the confusion. You should be able to test and verify that installing m2crypto 0.28.2-3 on a RHEL 7 server will break subscription-manager. We did it on two different servers to verify.
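A minimal reproduction along those lines (a sketch, assuming the Salt repository above is attached) would be:

# pull in the Salt-repo build, then exercise subscription-manager
yum install m2crypto-0.28.2-3.el7
subscription-manager facts    # fails with the sk_deep_copy error on affected servers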

@dmurphy18
Contributor

dmurphy18 commented Aug 16, 2018

Using an AWS Red Hat 7 server, I am not seeing the issue.

[root@ip-10-0-0-225 ec2-user]# yum list m2crypto*
Loaded plugins: amazon-id, rhui-lb, search-disabled-repos
Installed Packages
m2crypto.x86_64 0.28.2-3.el7 @salt-latest
Available Packages
m2crypto-debuginfo.x86_64 0.28.2-3.el7 salt-latest
[root@ip-10-0-0-225 ec2-user]# yum update
Loaded plugins: amazon-id, rhui-lb, search-disabled-repos
No packages marked for update
[root@ip-10-0-0-225 ec2-user]#

[root@ip-10-0-0-225 ec2-user]# cat /etc/yum.repos.d/salt-latest.repo
[salt-latest]
name=SaltStack Latest Release Channel for RHEL/Centos $releasever
baseurl=https://repo.saltstack.com/yum/redhat/7/$basearch/latest
failovermethod=priority
enabled=1
gpgcheck=1
gpgkey=file:///etc/pki/rpm-gpg/saltstack-signing-key
[root@ip-10-0-0-225 ec2-user]#
[root@ip-10-0-0-225 ec2-user]# salt-run --versions-report
Salt Version:
Salt: 2018.3.2

Dependency Versions:
cffi: Not Installed
cherrypy: Not Installed
dateutil: 1.5
docker-py: Not Installed
gitdb: Not Installed
gitpython: Not Installed
ioflo: Not Installed
Jinja2: 2.7.2
libgit2: Not Installed
libnacl: Not Installed
M2Crypto: 0.28.2
Mako: Not Installed
msgpack-pure: Not Installed
msgpack-python: 0.4.6
mysql-python: Not Installed
pycparser: Not Installed
pycrypto: 2.6.1
pycryptodome: Not Installed
pygit2: Not Installed
Python: 2.7.5 (default, May 31 2018, 09:41:32)
python-gnupg: Not Installed
PyYAML: 3.11
PyZMQ: 15.3.0
RAET: Not Installed
smmap: Not Installed
timelib: Not Installed
Tornado: 4.2.1
ZMQ: 4.1.4

System Versions:
dist: redhat 7.5 Maipo
locale: UTF-8
machine: x86_64
release: 3.10.0-862.el7.x86_64
system: Linux
version: Red Hat Enterprise Linux Server 7.5 Maipo

[root@ip-10-0-0-225 ec2-user]#
[root@ip-10-0-0-225 ec2-user]# cat /etc/redhat-release
Red Hat Enterprise Linux Server release 7.5 (Maipo)
[root@ip-10-0-0-225 ec2-user]#

[root@ip-10-0-0-225 ec2-user]# find / -name "libcrypto*"
/usr/lib64/libcrypto.so.1.0.2k
/usr/lib64/libcrypto.so.10
[root@ip-10-0-0-225 ec2-user]# rpm -qf /usr/lib64/libcrypto.so.10
openssl-libs-1.0.2k-12.el7.x86_64
[root@ip-10-0-0-225 ec2-user]#

Can you provide a salt-run --versions-report from a server which is actually exhibiting the problem?

@josephname
Author

They were RHEL 7.2. The problem appears when you run subscription-manager, which is how you verify Red Hat subscriptions. On the system above, run "subscription-manager facts".

@josephname
Author

We had to fix our servers and move on; I just thought you would be interested in this. I might be able to re-create it on another server soon. The error from subscription-manager is pasted in my first comment.

@dmurphy18
Contributor

I noticed that you were using 7.2, but unfortunately the quick route was AWS, which provided 7.5. I shall search through AWS to see if I can find a 7.2 image. To help with reproducing this, can you provide the contents of /etc/redhat-release?

Running subscription-manager facts did not appear to have any issues:

[root@ip-10-0-0-225 ec2-user]# subscription-manager facts
cpu.core(s)_per_socket: 1
cpu.cpu(s): 1
cpu.cpu_socket(s): 1
cpu.thread(s)_per_core: 1
cpu.topology_source: kernel /sys cpu sibling lists
distribution.id: Maipo
distribution.name: Red Hat Enterprise Linux Server
distribution.version: 7.5
distribution.version.modifier: ga
dmi.bios.address: 0xe8000
dmi.bios.bios_revision: 4.2
dmi.bios.relase_date: 08/24/2006
dmi.bios.rom_size: 64 KB
dmi.bios.runtime_size: 96 KB
dmi.bios.vendor: Xen
dmi.bios.version: 4.2.amazon
dmi.chassis.asset_tag: Not Specified
dmi.chassis.boot-up_state: Safe
dmi.chassis.lock: Not Present
dmi.chassis.manufacturer: Xen
dmi.chassis.power_supply_state: Safe
dmi.chassis.security_status: Unknown
dmi.chassis.serial_number: Not Specified
dmi.chassis.thermal_state: Safe
dmi.chassis.type: Other
dmi.chassis.version: Not Specified
dmi.memory.array_handle: 0x1000
dmi.memory.assettag: Not Specified
dmi.memory.bank_locator: Not Specified
dmi.memory.data_width: 64 bit
dmi.memory.error_correction_type: Multi-bit ECC
dmi.memory.error_information_handle: Not Provided
dmi.memory.form_factor: DIMM
dmi.memory.location: Other
dmi.memory.locator: DIMM 0
dmi.memory.manufacturer: Not Specified
dmi.memory.maximum_capacity: 1 GB
dmi.memory.part_number: Not Specified
dmi.memory.serial_number: Not Specified
dmi.memory.size: 1024 MB
dmi.memory.speed: (ns)
dmi.memory.total_width: 64 bit
dmi.memory.type: RAM
dmi.memory.use: System Memory
dmi.meta.cpu_socket_count: 1
dmi.processor.family: Other
dmi.processor.socket_designation: CPU 1
dmi.processor.status: Populated:Enabled
dmi.processor.type: Central Processor
dmi.processor.upgrade: Other
dmi.processor.version: Not Specified
dmi.processor.voltage: Unknown
dmi.system.family: Not Specified
dmi.system.manufacturer: Xen
dmi.system.product_name: HVM domU
dmi.system.serial_number: ec280b47-3764-21d3-3833-9f83d5668043
dmi.system.sku_number: Not Specified
dmi.system.status: No errors detected
dmi.system.uuid: EC280B47-3764-21D3-3833-9F83D5668043
dmi.system.version: 4.2.amazon
dmi.system.wake-up_type: Power Switch
lscpu.architecture: x86_64
lscpu.bogomips: 4800.12
lscpu.byte_order: Little Endian
lscpu.core(s)_per_socket: 1
lscpu.cpu(s): 1
lscpu.cpu_family: 6
lscpu.cpu_mhz: 2399.849
lscpu.cpu_op-mode(s): 32-bit, 64-bit
lscpu.flags: fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx rdtscp lm constant_tsc rep_good nopl xtopology eagerfpu pni pclmulqdq ssse3 fma cx16 pcid sse4_1 sse4_2 x2apic movbe popcnt tsc_deadline_timer aes xsave avx f16c rdrand hypervisor lahf_lm abm fsgsbase bmi1 avx2 smep bmi2 erms invpcid xsaveopt
lscpu.hypervisor_vendor: Xen
lscpu.l1d_cache: 32K
lscpu.l1i_cache: 32K
lscpu.l2_cache: 256K
lscpu.l3_cache: 30720K
lscpu.model: 63
lscpu.model_name: Intel(R) Xeon(R) CPU E5-2676 v3 @ 2.40GHz
lscpu.numa_node(s): 1
lscpu.numa_node0_cpu(s): 0
lscpu.on-line_cpu(s)_list: 0
lscpu.socket(s): 1
lscpu.stepping: 2
lscpu.thread(s)_per_core: 1
lscpu.vendor_id: GenuineIntel
lscpu.virtualization_type: full
memory.memtotal: 1013900
memory.swaptotal: 0
net.interface.eth0.ipv4_address: 10.0.0.225
net.interface.eth0.ipv4_address_list: 10.0.0.225
net.interface.eth0.ipv4_broadcast: 10.0.0.255
net.interface.eth0.ipv4_broadcast_list: 10.0.0.255
net.interface.eth0.ipv4_netmask: 24
net.interface.eth0.ipv4_netmask_list: 24
net.interface.eth0.ipv6_address.link: fe80::8d2:cdff:fe92:d272
net.interface.eth0.ipv6_address.link_list: fe80::8d2:cdff:fe92:d272
net.interface.eth0.ipv6_netmask.link: 64
net.interface.eth0.ipv6_netmask.link_list: 64
net.interface.eth0.mac_address: 0A:D2:CD:92:D2:72
net.interface.lo.ipv4_address: 127.0.0.1
net.interface.lo.ipv4_address_list: 127.0.0.1
net.interface.lo.ipv4_broadcast: Unknown
net.interface.lo.ipv4_broadcast_list: Unknown
net.interface.lo.ipv4_netmask: 8
net.interface.lo.ipv4_netmask_list: 8
net.interface.lo.ipv6_address.host: ::1
net.interface.lo.ipv6_address.host_list: ::1
net.interface.lo.ipv6_netmask.host: 128
net.interface.lo.ipv6_netmask.host_list: 128
network.fqdn: ip-10-0-0-225.ec2.internal
network.hostname: ip-10-0-0-225.ec2.internal
network.ipv4_address: 10.0.0.225
network.ipv6_address: fe80::8d2:cdff:fe92:d272
proc_cpuinfo.common.address_sizes: 46 bits physical, 48 bits virtual
proc_cpuinfo.common.apicid: 0
proc_cpuinfo.common.bogomips: 4800.12
proc_cpuinfo.common.cache_alignment: 64
proc_cpuinfo.common.cache_size: 30720 KB
proc_cpuinfo.common.clflush_size: 64
proc_cpuinfo.common.core_id: 0
proc_cpuinfo.common.cpu_cores: 1
proc_cpuinfo.common.cpu_family: 6
proc_cpuinfo.common.cpu_mhz: 2399.849
proc_cpuinfo.common.cpuid_level: 13
proc_cpuinfo.common.flags: fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx rdtscp lm constant_tsc rep_good nopl xtopology eagerfpu pni pclmulqdq ssse3 fma cx16 pcid sse4_1 sse4_2 x2apic movbe popcnt tsc_deadline_timer aes xsave avx f16c rdrand hypervisor lahf_lm abm fsgsbase bmi1 avx2 smep bmi2 erms invpcid xsaveopt
proc_cpuinfo.common.fpu: yes
proc_cpuinfo.common.fpu_exception: yes
proc_cpuinfo.common.initial_apicid: 0
proc_cpuinfo.common.microcode: 0x3c
proc_cpuinfo.common.model: 63
proc_cpuinfo.common.model_name: Intel(R) Xeon(R) CPU E5-2676 v3 @ 2.40GHz
proc_cpuinfo.common.physical_id: 0
proc_cpuinfo.common.power_management: Unknown
proc_cpuinfo.common.processor: 0
proc_cpuinfo.common.siblings: 1
proc_cpuinfo.common.stepping: 2
proc_cpuinfo.common.vendor_id: GenuineIntel
proc_cpuinfo.common.wp: yes
system.certificate_version: 3.2
system.default_locale: en_US.UTF-8
uname.machine: x86_64
uname.nodename: ip-10-0-0-225.ec2.internal
uname.release: 3.10.0-862.el7.x86_64
uname.sysname: Linux
uname.version: #1 SMP Wed Mar 21 18:14:51 EDT 2018
virt.host_type: xen, xen-hvm, aws
virt.is_guest: True
virt.uuid: ec280b47-3764-21d3-3833-9f83d5668043

WARNING

The yum plugins: /etc/yum/pluginconf.d/subscription-manager.conf, /etc/yum/pluginconf.d/product-id.conf were automatically enabled for the benefit of Red Hat Subscription Management. If not desired, use "subscription-manager config --rhsm.auto_enable_yum_plugins=0" to block this behavior.

[root@ip-10-0-0-225 ec2-user]#

And thanks for bringing this to our attention. Most development is done on CentOS 7, but QA does test against Red Hat 7 and CentOS 7.

@dmurphy18
Contributor

dmurphy18 commented Aug 20, 2018

@josephname I have installed RHEL 7.2 (Server with GUI) from the ISO, installed Salt 2018.3.2, and did not encounter an error running 'subscription-manager facts':

[root@localhost test]# subscription-manager facts
cpu.core(s)_per_socket: 1
cpu.cpu(s): 1
cpu.cpu_socket(s): 1
cpu.thread(s)_per_core: 1
cpu.topology_source: kernel /sys cpu sibling lists
distribution.id: Maipo
distribution.name: Red Hat Enterprise Linux Server
distribution.version: 7.2
distribution.version.modifier: ga
dmi.baseboard.manufacturer: Oracle Corporation
dmi.baseboard.product_name: VirtualBox
dmi.baseboard.serial_number: 0
dmi.baseboard.version: 1.2
dmi.bios.address: 0xe0000
dmi.bios.relase_date: 12/01/2006
dmi.bios.rom_size: 128 KB
dmi.bios.runtime_size: 128 KB
dmi.bios.vendor: innotek GmbH
dmi.bios.version: VirtualBox
dmi.chassis.asset_tag: Not Specified
dmi.chassis.boot-up_state: Safe
dmi.chassis.lock: Not Present
dmi.chassis.manufacturer: Oracle Corporation
dmi.chassis.power_supply_state: Safe
dmi.chassis.security_status: None
dmi.chassis.serial_number: Not Specified
dmi.chassis.thermal_state: Safe
dmi.chassis.type: Other
dmi.chassis.version: Not Specified
dmi.system.family: Virtual Machine
dmi.system.manufacturer: innotek GmbH
dmi.system.product_name: VirtualBox
dmi.system.serial_number: 0
dmi.system.sku_number: Not Specified
dmi.system.uuid: df1f75ba-1835-4986-af7e-6ccb66b061f1
dmi.system.version: 1.2
dmi.system.wake-up_type: Power Switch
lscpu.architecture: x86_64
lscpu.bogomips: 4589.36
lscpu.byte_order: Little Endian
lscpu.core(s)_per_socket: 1
lscpu.cpu(s): 1
lscpu.cpu_family: 6
lscpu.cpu_mhz: 2294.684
lscpu.cpu_op-mode(s): 32-bit, 64-bit
lscpu.hypervisor_vendor: KVM
lscpu.l1d_cache: 32K
lscpu.l1i_cache: 32K
lscpu.l2_cache: 256K
lscpu.l3_cache: 6144K
lscpu.model: 60
lscpu.model_name: Intel(R) Core(TM) i7-4712HQ CPU @ 2.30GHz
lscpu.numa_node(s): 1
lscpu.numa_node0_cpu(s): 0
lscpu.on-line_cpu(s)_list: 0
lscpu.socket(s): 1
lscpu.stepping: 3
lscpu.thread(s)_per_core: 1
lscpu.vendor_id: GenuineIntel
lscpu.virtualization_type: full
memory.memtotal: 2049052
memory.swaptotal: 2097148
net.interface.enp0s3.ipv4_address: 10.4.237.234
net.interface.enp0s3.ipv4_broadcast: 10.4.255.255
net.interface.enp0s3.ipv4_netmask: 16
net.interface.enp0s3.ipv6_address.link: fe80::a00:27ff:fed9:83e4
net.interface.enp0s3.ipv6_netmask.link: 64
net.interface.enp0s3.mac_address: 08:00:27:D9:83:E4
net.interface.lo.ipv4_address: 127.0.0.1
net.interface.lo.ipv4_broadcast: Unknown
net.interface.lo.ipv4_netmask: 8
net.interface.lo.ipv6_address.host: ::1
net.interface.lo.ipv6_netmask.host: 128
net.interface.virbr0-nic.mac_address: 52:54:00:E9:A6:FD
net.interface.virbr0.ipv4_address: 192.168.122.1
net.interface.virbr0.ipv4_broadcast: 192.168.122.255
net.interface.virbr0.ipv4_netmask: 24
net.interface.virbr0.mac_address: 52:54:00:E9:A6:FD
network.hostname: localhost.localdomain
network.ipv4_address: 127.0.0.1
network.ipv6_address: ::1
system.certificate_version: 3.2
uname.machine: x86_64
uname.nodename: localhost.localdomain
uname.release: 3.10.0-327.el7.x86_64
uname.sysname: Linux
uname.version: #1 SMP Thu Oct 29 17:29:29 EDT 2015
virt.host_type: virtualbox, kvm
virt.is_guest: True
virt.uuid: df1f75ba-1835-4986-af7e-6ccb66b061f1
[root@localhost test]#

I did encounter issues installing the following from repo.saltstack.com:
python-markupsafe
libyaml
python-jinja2

where I was getting Header V3 RSA/SHA256 signature NOKEY errors, which is strange, especially since these come from the base sub-directory, are taken from the CentOS 7.0 ISO, and are only provided as a convenience for some users of Salt.

From the build master:
[root@qa-master latest]# rpm --checksig -v base/libyaml-0.1.4-11.el7_0.x86_64.rpm
base/libyaml-0.1.4-11.el7_0.x86_64.rpm:
Header V3 RSA/SHA256 Signature, key ID f4a80eb5: OK
Header SHA1 digest: OK (199a4ae32c478d6b3255b183d3ceac8993a67d24)
V3 RSA/SHA256 Signature, key ID f4a80eb5: OK
MD5 digest: OK (c696db7f90227831c1b92122a69d3794)
[root@qa-master latest]#
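If the NOKEY errors simply mean the signing key was never imported into the rpm database (an assumption; the key path is taken from the salt-latest.repo above), importing it manually may clear them:

# import the SaltStack signing key, then list the keys rpm now knows about
rpm --import /etc/pki/rpm-gpg/saltstack-signing-key
rpm -q gpg-pubkey --qf '%{NAME}-%{VERSION}-%{RELEASE}\t%{SUMMARY}\n'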

@josephname
Author

I don't know if it makes a difference, but this was on a physical server, not virtual.

@damon-atkins
Contributor

I still think you should separate the RPMs into those supported by Salt and dependencies, with anything in the dependencies repo being recommended only (so the dependencies repo does not have to be used).

Keep in mind that a RHEL RPM may be built against code that is not the same as the open-source version. Because RHEL back-ports the fixes they want, the open-source version 1.1 of a piece of software may not be the same as RHEL's 1.1.

@dmurphy18
Contributor

@josephname There should not be a difference between physical and virtual servers; I have not encountered any issues using VirtualBox VM images compared to physical servers in my years of using it. Again, I request the output of --versions-report from a server exhibiting the problem; can you also include the contents of /etc/redhat-release and the output of 'yum list openssl*'?

@dmurphy18
Contributor

@damon-atkins Splitting Salt and its dependencies into different repos will take time, especially given the number of platforms to support. With only just over a year before Python 2's official EOL, it is something to be considered for Python 3 support.

@josephname
Author

For now, we just removed m2crypto from the Salt repo on our satellite server. Salt seems to work with any of the versions we've used, as long as they come from Red Hat. I will see if I can re-create the problem on another server.
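For a plain yum setup (rather than Satellite), a similar effect can be had by excluding the package in the repo definition; a sketch, based on the salt-latest.repo shown earlier:

# add to the [salt-latest] section of /etc/yum.repos.d/salt-latest.repo
exclude=m2crypto*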

@dmurphy18
Contributor

@josephname In the interest of trying to resolve this issue for you and others that may encounter it, could you provide the information I requested? Thanks.

@dmurphy18
Contributor

@josephname Never mind, I just managed to duplicate the issue; I will debug and attempt to resolve it.

@josephname
Author

Good deal. Thank you for your patience.

@josephname
Author

Here is the info:
$ cat /etc/redhat-release
Red Hat Enterprise Linux Server release 7.2 (Maipo)

Salt Version:
Salt: 2018.3.2

Dependency Versions:
cffi: 0.8.6
cherrypy: Not Installed
dateutil: 1.5
docker-py: Not Installed
gitdb: Not Installed
gitpython: Not Installed
ioflo: Not Installed
Jinja2: 2.7.2
libgit2: Not Installed
libnacl: Not Installed
M2Crypto: 0.21.1
Mako: Not Installed
msgpack-pure: Not Installed
msgpack-python: 0.5.6
mysql-python: Not Installed
pycparser: 2.14
pycrypto: 2.6.1
pycryptodome: Not Installed
pygit2: Not Installed
Python: 2.7.5 (default, Oct 11 2015, 17:47:16)
python-gnupg: Not Installed
PyYAML: 3.11
PyZMQ: 15.3.0
RAET: Not Installed
smmap: Not Installed
timelib: Not Installed
Tornado: 4.2.1
ZMQ: 4.1.4

System Versions:
dist: redhat 7.2 Maipo
locale: UTF-8
machine: x86_64
release: 3.10.0-327.el7.x86_64
system: Linux
version: Red Hat Enterprise Linux Server 7.2 Maipo

yum list openssl
Loaded plugins: enabled_repos_upload, langpacks, package_upload, product-id, search-disabled-repos, subscription-manager
Installed Packages
openssl.x86_64 1:1.0.1e-42.el7_1.9 @anaconda/7.2
Uploading Enabled Repositories Report
Loaded plugins: langpacks, product-id

@dmurphy18
Contributor

@josephname The problem appears to be the version of openssl that Salt's M2Crypto was built against, openssl.x86_64 1:1.0.2k-8.el7, which is newer than that on Red Hat 7.2. I have tested the functionality on RHEL 7.5 with openssl.x86_64 1:1.0.2k-12.el7 and encountered no issue. The latest point releases for Salt are scheduled for the end of this month; I shall ensure that all dependency packages are built such that they are compatible with RHEL 7.2. Thank you for bringing this issue to SaltStack's attention.
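For anyone already bitten, one recovery path (a sketch; the repo id is taken from the salt-latest.repo example above and may differ on your system, and it assumes Red Hat's m2crypto is available from an attached repo) is to swap back to the distribution build:

# drop the Salt-repo 0.28.2-3 build
yum remove m2crypto
# reinstall Red Hat's build (0.21.1 on RHEL 7.2), skipping the Salt repo
yum install m2crypto --disablerepo=salt-latest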

@josephname
Author

most excellent, thank you for your quick and valuable solution!

@dmurphy18
Contributor

Packaging issues are dealt with in the salt-pack repo; see vmware-archive/salt-pack#580.

@josephname If this satisfies your issue, please consider closing it, or alternatively close after the next point release.

@josephname
Author

when it's released, should I use https://repo.saltstack.com/yum/redhat/7Server/x86_64/2018.4 ?

@dmurphy18
Contributor

dmurphy18 commented Aug 23, 2018

@josephname The next point release for the 2018.3 branch shall be called 2018.3.4.

The link https://repo.saltstack.com/yum/redhat/7Server/x86_64/2018.3 is updated to always point to the newest release for that branch. If you want to remain on a particular point release, the URL would be https://repo.saltstack.com/yum/redhat/7Server/x86_64/archive/2018.3.4 for the 2018.3.4 point release.
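For example, a repo file pinned to that point release (a sketch modeled on the salt-latest.repo shown earlier; adjust the repo id and name as you see fit) would look like:

[salt-2018.3.4]
name=SaltStack 2018.3.4 Release Channel for RHEL/Centos $releasever
baseurl=https://repo.saltstack.com/yum/redhat/7Server/x86_64/archive/2018.3.4
failovermethod=priority
enabled=1
gpgcheck=1
gpgkey=file:///etc/pki/rpm-gpg/saltstack-signing-key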

Hope this clarifies things for you.

@dmurphy18
Contributor

openssl versions differ between RHEL 7.2 and CentOS 7.2:
RHEL 7.2:        1:1.0.1e-42.el7_1.9
CentOS 7.2.1511: 1:1.0.2k-12.el7

@dmurphy18
Contributor

This should be resolved in the next point release for the 2018.3 branch.

@mattp-
Contributor

mattp- commented Oct 2, 2018

@dmurphy18 Will this be resolved in 2018.3.3, or in 2018.3.4 as you mentioned above?

@dmurphy18
Contributor

@mattp- It is resolved in Salt 2018.3.3, which should be released soon.

@dmurphy18
Contributor

@mattp- With the recent point releases of 2018.3.4 and 2019.2.0, can you check whether this is still an issue? Thanks.

@dmurphy18 dmurphy18 added the info-needed waiting for more info label Mar 12, 2019
@josephname
Author

Unfortunately, we don't have any RHEL 7.2 servers any longer; other factors drove us to update to RHEL 7.5. Also, we started ignoring the version of m2crypto included in the Salt repositories and use only the m2crypto in Red Hat's repositories. We've not had any issues related to m2crypto since then. Thank you for your perseverance and attention to this issue.

@dmurphy18
Contributor

@josephname Sorry to hear that.
FYI, the m2crypto provided by SaltStack is now v0.31.0, and the point releases provide support for a later version of openssl (similar to the version available from RHEL 7.5) in case of installation on RHEL 7.2.

The differing openssl versions between CentOS 7.2 and RHEL 7.2 were a rare case of divergence between the two distributions.

If you consider this issue resolved, please consider closing it; or if there is any other issue related to this, let me know and I shall endeavor to resolve it.

@josephname
Author

Yes, I will close it. Sorry if I caused confusion or trouble.
