W: Invalid ‘Date’ entry in Release file /var/lib/apt/lists/_var_nv-gie-repo-ga-cuda8.0-gie1.0-20170116_Release
W: GPG error: file:/var/libopencv4tegra-repo Release: The following signatures couldn’t be verified because the public key is not available: NO_PUBKEY D88C3D385C37D3BE
W: The repository ‘file:/var/libopencv4tegra-repo Release’ is not signed.
N: Data from such a repository can’t be authenticated and is therefore potentially dangerous to use.
N: See apt-secure(8) manpage for repository creation and user configuration details.
W: Invalid ‘Date’ entry in Release file /var/lib/apt/lists/_var_libopencv4tegra-repo_Release
Looks like two of them: an invalid date and an unsigned repo. How do I fix this?
Thanks LLL74, that worked for me as well. Some research on the ‘Date’ entry turned up the following:
The solution is to tell the owners of the respective repositories to fix their Release file(s). The Date (and Valid-Until) field MUST be in UTC (aka GMT, Z, +0000). Earlier apt versions accepted other timezones silently, but parsed it as UTC anyhow which could cause all kinds of fun.
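To make the UTC requirement concrete, here is a minimal sketch (the sample Release file and the zone check are illustrative, not NVIDIA’s actual files) showing the kind of Date entry newer apt accepts versus the kind that triggers the warning:

```shell
# Write a sample Release file with a UTC date, the format newer apt requires.
release=$(mktemp)
cat > "$release" <<'EOF'
Origin: Sample
Date: Mon, 16 Jan 2017 19:30:00 UTC
EOF

# A valid Date ends in UTC/GMT/+0000; anything else (e.g. "+0530")
# is what produces the "Invalid 'Date' entry" warning.
if grep -qE '^Date: .*( UTC| GMT| \+0000)$' "$release"; then
    echo "Date entry is valid UTC"
else
    echo "Date entry is NOT UTC -- apt will warn"
fi
rm -f "$release"
```

You can run the same grep against the files under /var/lib/apt/lists/ to see which cached Release files are the offenders.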
I will look at doing this on my TX2 tonight. On the Date issue, I assume this “telling the owners of the respective repositories” is something the Nvidia L4T team needs to take care of, since I got these errors on a virgin install of R27.1 and none of the repositories were added by me.
It is my view that any Debian- or Ubuntu-based virgin install supplied by a big manufacturer such as Nvidia, on their own hardware, should work 100% correctly right out of the gate on something as mainstream as apt-get update. That’s especially true given that the (admittedly awesome) hardware costs $599, while similar hardware in Shield form is a third of the price and works 100% out of the box with no problems at all.
Not that this is a big issue - everything seems to work fine - but there is definitely a whiff of “uh-oh” for anybody who has used apt-get for a long time and then starts getting these kinds of warnings and errors. It’s a confidence / perception thing.
Or maybe we’re never supposed to run apt-get update/upgrade in the first place, and should rely only on new JetPack releases for updates? Please advise. This would be “interesting” if it were the case, but it would be good to know.
Just one more point: the apt-get update problem occurs not only on the Jetson TX2, but also on the host computer that JetPack was run on to flash it.
Yes, I am facing the same problem. Initially I was using an Ubuntu 16.04 host computer, but today I made a fresh install of 14.04, downloaded JetPack, and ran it.
After installing, apt-get update is broken on the Nvidia TX2.
And the host computer gets totally trashed. It won’t apt-get update: besides the original problem with the Release file date, it also cannot find packages under …/binary-arm64/Packages
I changed a few mirrors, but got the same result. The mirrors have had their binary-arm64 folders removed. Maybe this is because it is Ubuntu 14.04, and they are partially pruning the mirrors.
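One workaround sometimes suggested for the host side (an assumption on my part, not an official NVIDIA fix) is to restrict the stock Ubuntu entries in /etc/apt/sources.list to the host’s own architectures, so apt stops requesting the binary-arm64/Packages files that the standard mirrors don’t carry. For a 14.04 (trusty) host, something like:

```
# Hypothetical /etc/apt/sources.list entry: limit the stock mirror to the
# host's own architectures so apt no longer asks it for binary-arm64 indexes.
deb [arch=amd64,i386] http://archive.ubuntu.com/ubuntu trusty main restricted universe multiverse
```

The arm64 repositories JetPack adds would still need their own entries (or removal) separately.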
Hello,
this seems pretty major to me.
Two questions:
are we supposed to use apt-get upgrade on the TX2 with R27.1?
I applied ‘sudo apt-mark hold xserver-xorg-core’.
And JetPack really, really must not affect the host’s ability to use apt-get update!
That is basically the core assumption about what programs must NOT do when you grant them root rights.
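For reference, here is roughly how the hold workaround I applied behaves (a sketch, assuming a Debian/Ubuntu system with apt-mark available; xserver-xorg-core is just the package from my command above):

```shell
# Pin xserver-xorg-core so apt-get upgrade will not replace it;
# the hold persists across updates until explicitly removed.
sudo apt-mark hold xserver-xorg-core

# Confirm which packages are currently held back.
apt-mark showhold

# Release the hold later, once a fixed package is available.
sudo apt-mark unhold xserver-xorg-core
```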
Is this happening on Ubuntu 14.04? There is a similar issue people are experiencing on Ubuntu 16.04 that breaks all system updates.
According to the JetPack installer download page “The JetPack installer requires Ubuntu 14.04 on your host PC”. I was going to create a 14.04 image to work around the issue, but if it’s broken in 14.04 then I’ll just have to wait for a fix.
Unfortunately, NVIDIA will not be able to deliver the fix via updates … oops!
Also, the download page mentions 16.04 in the table, so it’s a little confusing to then see, lower down, the 14.04 requirement for the host. That particular line is easy to miss, especially since most people are now used to 16.04 as the LTS. I mean, 14.04 will be defunct in less than 2 years, so Nvidia should really move the entire stack to 16.04.
Did you do that on a host?
If so, is the host an ARM-based system?
That error message is saying that the host you’re running dpkg on is arm64 based.
The “solution” is intended for an x86_64-based host system, where instead I get this warning:
jwatte@ub16:/usr/local/data$ sudo dpkg --force-architecture --remove-architecture arm64
dpkg: warning: removing architecture 'arm64' currently in use by database
But that seems OK, because at least I can update other packages on the host …
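Before forcing the removal, it may be worth checking what dpkg actually has configured; this is a sketch of the read-only checks I’d run first (the purge pattern in the comment is an example, adjust to taste):

```shell
# Native architecture of the host (amd64 on an x86_64 PC).
dpkg --print-architecture

# Foreign architectures apt is tracking; arm64 appears here after JetPack runs.
dpkg --print-foreign-architectures

# List installed packages whose architecture is arm64; the
# "currently in use by database" warning means this list is non-empty.
dpkg -l | awk '$4 == "arm64" {print $2}'

# Once those are purged (e.g. sudo apt-get purge ".*:arm64"), the
# architecture can be dropped without --force:
#   sudo dpkg --remove-architecture arm64
```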
Ugh, I just managed to toast my whole Ubuntu desktop machine trying to fix this. I was tired and just wanted to build a Torch app, and was getting apt errors dealing with arm64 - or was it amd64? Yeah, I think that was the problem :) So I just started to force it, and somehow got it to start removing all the packages instead of just the arm64 ones. I caught it halfway through, but it was too late. So, word of warning: be careful when trying to fix this.