To get past Google Drive's Download Quota

  1. Make a "copy" of the file to your G Drive

  2. Make a folder and put the file in it

  3. Download folder.

  4. Profit?!!

:-)
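
If you'd rather script steps 1 and 2 than click through the web UI, here's a rough sketch against the Google Drive v3 REST API. It assumes you already have an OAuth access token; $accessToken, $fileId and $folderId are placeholders you'd fill in yourself, so treat it as an illustration rather than a tested tool.

# Hypothetical sketch of the copy trick via the Drive v3 API; placeholders throughout.
$headers = @{ Authorization = "Bearer $accessToken" }

# Steps 1-2: copy the quota-limited file straight into a folder you own
$body = @{ parents = @($folderId) } | ConvertTo-Json
$copy = Invoke-RestMethod -Method Post -Headers $headers -ContentType "application/json" `
    -Uri "https://www.googleapis.com/drive/v3/files/$fileId/copy" -Body $body

# Step 3: download the folder from the Drive web UI as usual (Drive zips it for you),
# or pull your own copy directly, which shouldn't be tied to the original file's quota
Invoke-RestMethod -Method Get -Headers $headers `
    -Uri "https://www.googleapis.com/drive/v3/files/$($copy.id)?alt=media" -OutFile "copy.bin"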

take my upvote, you absolute legend!

I'm so confused on how to use this thread

Click the links, go to the page, pick what you want, click the corresponding link, and download.

For everyone who can't download from archive.org, make sure you have made an account on archive.org.

archive.org locks out guest users when collections use a large percentage of its Internet bandwidth.

I just joined this subreddit. Where do I find the ROMs? I'm confused.

Will there also be romhacks?

romhack collections would be extremely nice (also fan translations)

I couldn't believe that there are actually people who couldn't find the link in the megathread lmfao

For downloading a lot of ROMs at once from archive.org, I created the following PowerShell script. It goes to the link in the $rooturl variable, grabs any links on the page that match the criteria (modify the Where-Object filters to suit your needs), and compiles a list of download links for the ROMs on that page. It then saves a file on your desktop called "archiveorglinks.txt" containing all of the links, which you can either trim down to just the ones you want or select in full and load into "Free Download Manager" to handle the downloads for you. You get faster downloads this way and can leave it alone while it does its thing. I hope this is helpful :)

$rooturl = "https://archive.org/download/nointro.n64/" #change to archive's root directory, ensure trailing slash exists

$links = (Invoke-WebRequest -Uri $rooturl).Links |

Where-Object {($_.innerHTML -ne "View Contents") -and ($_.href -notlike "*Europe*") -and ($_.href -notlike "*Japan*") -and ($_.href -notlike "*Germany*") -and ($_.href -notlike "*France*") -and ($_.href -like "*.7z")} |

Select-Object -ExpandProperty href

$URLs = @()

$desktop = [Environment]::GetFolderPath("Desktop")

$savefile = "$desktop\archiveorglinks.txt"

foreach ($link in $links){

$URLs += $rooturl + $link

}

$URLs | Out-File -FilePath $savefile

*EDIT*
If you come across an archive that doesn't have 7zip files, replace the last filter, ($_.href -like "*.7z"), with the appropriate extension for that archive's files, e.g. ($_.href -like "*.zip") for zip files.
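
If you'd rather not use Free Download Manager at all, a short loop like the one below (an untested sketch, assuming the archiveorglinks.txt produced by the script above) will pull the files down directly with PowerShell. FDM is still usually faster since it downloads in parallel, but this keeps everything in one place.

# Rough sketch: download everything in archiveorglinks.txt straight from PowerShell.
# Assumes the list file from the script above; files land in a "ROMs" folder on the desktop.
$desktop = [Environment]::GetFolderPath("Desktop")
$outdir  = Join-Path $desktop "ROMs"
New-Item -ItemType Directory -Path $outdir -Force | Out-Null

Get-Content "$desktop\archiveorglinks.txt" | ForEach-Object {
    # Derive a local file name from the URL and decode %20 and friends
    $name = [System.Uri]::UnescapeDataString(($_ -split '/')[-1])
    Invoke-WebRequest -Uri $_ -OutFile (Join-Path $outdir $name)
}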