Moodle Server Crawler / Downloader

Web crawler for downloading all resources from Moodle servers.

Your school is temporarily shutting down its Moodle server for whatever reason? No worries, MoodleCrawler to the rescue ;) This is a simple single-threaded Moodle crawler GUI application that helps you download all resources attached by your teachers. It saves every resource using the same directory structure as shown on the Moodle server, and also saves some of the pages as HTML.

Written in C# using Windows Forms.
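
At its core, the download step described above amounts to fetching each resource URL and writing the bytes to a local path that mirrors the server's course layout. A minimal C# sketch of that idea (the class, method, and parameter names here are illustrative, not the project's actual API):

```csharp
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

class ResourceDownloader
{
    static readonly HttpClient Client = new HttpClient();

    // Download one resource and save it under a local path that mirrors
    // the course/section layout seen on the Moodle server.
    static async Task SaveResourceAsync(string url, string relativePath, string rootDir)
    {
        string localPath = Path.Combine(rootDir, relativePath);
        string dir = Path.GetDirectoryName(localPath);
        if (!string.IsNullOrEmpty(dir))
            Directory.CreateDirectory(dir); // create the mirrored folder tree on demand
        byte[] data = await Client.GetByteArrayAsync(url);
        File.WriteAllBytes(localPath, data);
    }
}
```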


The only thing you need to do is set the UrlMoodleRoot setting to your Moodle server's URL in the application's configuration file, for example:

    <?xml version="1.0" encoding="utf-8"?>
    <configuration>
        <configSections>
            <sectionGroup name="userSettings" type="System.Configuration.UserSettingsGroup, System, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089">
                <section name="Moodle.WebCrawler.Properties.Settings" type="System.Configuration.ClientSettingsSection, System, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" allowExeDefinition="MachineToLocalUser" requirePermission="false" />
            </sectionGroup>
        </configSections>
        <startup>
            <supportedRuntime version="v4.0" sku=".NETFramework,Version=v4.7" />
        </startup>
        <userSettings>
            <Moodle.WebCrawler.Properties.Settings>
                <setting name="UrlMoodleRoot" serializeAs="String">
                    <!-- replace with your school's Moodle root URL -->
                    <value>https://moodle.your-school.example</value>
                </setting>
            </Moodle.WebCrawler.Properties.Settings>
        </userSettings>
    </configuration>
NOTE: If you are building from source, right-click the Moodle.WebCrawler project -> Properties -> Settings and change the value to your school's URL.
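
At runtime, a setting defined this way is read through the settings class that Visual Studio generates from Settings.settings. Assuming the default VS naming (this fragment depends on the generated Properties.Settings class and will not compile on its own):

```csharp
// Properties.Settings is the designer-generated class; UrlMoodleRoot
// maps to the <setting> entry in the configuration file above.
string moodleRoot = Moodle.WebCrawler.Properties.Settings.Default.UrlMoodleRoot;
```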

I would also highly recommend running it in debug mode from Visual Studio: the application is single threaded and I didn't take the time to make the GUI display request activity, so all activity appears in the VS Output window. Alternatively, compile the project as a console application so that, while you use the GUI, the requests are printed to an accompanying console window. Tools such as Wireshark or Fiddler can also be used during crawling to inspect the requests being sent in the background.
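
The two output channels mentioned above map onto System.Diagnostics and the console. A sketch of how request activity could be logged so it shows up in both places (a hypothetical helper, not code from this project):

```csharp
using System;
using System.Diagnostics;

static class RequestLog
{
    // Debug.WriteLine output appears in the Visual Studio Output window
    // (Debug builds); Console.WriteLine output appears only when the
    // project's Output Type is set to Console Application.
    public static void Log(string url, int statusCode)
    {
        string line = $"GET {url} -> {statusCode}";
        Debug.WriteLine(line);
        Console.WriteLine(line);
    }
}
```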


My school is shutting down the Moodle server for some time 10 days before our final exam, so I figured it wouldn't take much time or effort to crawl all the data on the server into an offline version. Besides, sometimes you don't have internet access when you're on the go, so it's always good to have a backup.

© 2019 - All rights reserved.