Hello,
While developing a crawler I noticed that some websites that Chrome and
Firefox display without problems throw SSL errors on libcurl requests,
for example:
SSL connect error
SSL routines::unsafe legacy renegotiation disabled
SSL peer certificate or SSH remote key was not OK
My idea was therefore to add the certificates used by Firefox to my
system's CA store to prevent that.
I downloaded the cacert.pem file from
https://curl.se/docs/caextract.html and added it to the CA store with
the following commands:
$ openssl x509 -in cacert-Mozzila.pem -out cacert-Mozzila.crt
$ sudo cp cacert-Mozzila.crt /usr/local/share/ca-certificates
$ sudo update-ca-certificates
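The conversion step can be checked in isolation. Here is a self-contained
sketch that generates a throwaway self-signed certificate (standing in for
the downloaded bundle; all paths are illustrative) and runs the same
openssl x509 conversion as above:

```shell
# Generate a throwaway self-signed certificate to stand in for the
# downloaded bundle (illustrative paths under /tmp)
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=test" \
  -keyout /tmp/test-key.pem -out /tmp/test-cacert.pem -days 1

# Same conversion as in the steps above: PEM in, PEM out under a .crt
# name (update-ca-certificates only picks up files ending in .crt)
openssl x509 -in /tmp/test-cacert.pem -out /tmp/test-cacert.crt

# Verify the converted file is still a parseable certificate
openssl x509 -in /tmp/test-cacert.crt -noout -subject
```

Note that a multi-certificate bundle like cacert.pem holds many
certificates in one file, while `openssl x509` only copies the first
one, so the converted .crt may contain far fewer roots than the
original bundle.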
However, those sites continue to throw those errors. What am I missing
to get the same responses as Firefox?
--
Regards,
Luis Figueira
--
Unsubscribe: https://lists.haxx.se/mailman/listinfo/curl-library
Etiquette: https://curl.se/mail/etiquette.html