I’ve noticed that our scans can take upwards of 10 minutes and sometimes time out. We use a non-vendored configuration and have private dependencies.
Do scans have additional costs associated with dependencies? I noticed that by breaking one dependency on a private repo, our scan went from 10+ minutes (or a timeout) down to 4 minutes.
This seems like unnecessary scan overhead, as I would expect the dependent private repo to have already been scanned.
So ultimately my question is: is the scan supposed to cover my Go module and its dependencies, or just my own module’s source code?
We only scan your code, but we require your dependencies to build type information, which helps with analysis. In the case of private dependencies, the Go module installer clones the repository directly instead of hitting the module cache, which increases analysis time.
So what would cause scan times for a repository to fluctuate between 10+ minutes, 4+ minutes, and 1+ minute? Keep in mind this is a new repo we just added for scanning.
The first analysis run can take some time, and a run on a freshly created PR may as well. The fluctuation you see likely depends on how long it takes to install the dependencies and build an import graph before the analysis can start.
Are there any metrics you can provide that would help us understand when a scan is taking an abnormal amount of time? The gap between 1+ minute and 10+ minutes seems too large for this alone to account for.
Are import graphs cached? Do they get rebuilt under certain conditions? How can we see how long the dependency-download portion takes, so we know whether the extra time is spent there?
Hey, can you provide your account name and repository? With those we can gather exact details, break down where the time is spent, and investigate further.