Start of public OpenTitan development history
Code contributors:
Alex Bradbury <asb@lowrisc.org>
Cindy Chen <chencindy@google.com>
Eunchan Kim <eunchan@google.com>
Gaurang Chitroda <gaurangg@google.com>
Mark Hayter <mark.hayter@gmail.com>
Michael Schaffner <msf@google.com>
Miguel Osorio <miguelosorio@google.com>
Nils Graf <nilsg@google.com>
Philipp Wagner <phw@lowrisc.org>
Pirmin Vogel <vogelpi@lowrisc.org>
Ram Babu Penugonda <rampenugonda@google.com>
Scott Johnson <scottdj@google.com>
Shail Kushwah <kushwahs@google.com>
Srikrishna Iyer <sriyer@google.com>
Steve Nelson <Steve.Nelson@wdc.com>
Tao Liu <taliu@google.com>
Timothy Chen <timothytim@google.com>
Tobias Wölfel <tobias.woelfel@mailbox.org>
Weicai Yang <weicai@google.com>
diff --git a/doc/ug/design.md b/doc/ug/design.md
new file mode 100644
index 0000000..1766277
--- /dev/null
+++ b/doc/ug/design.md
@@ -0,0 +1,165 @@
+# Design Methodology within OpenTitan
+
+The design methodology within OpenTitan combines the rigor of industry-strength design methodologies with open source ambitions.
+When in conflict, quality must win, and thus we aim to create a final design product that is equal to the quality required from a full production silicon chip tapeout.
+
+## Language and Tool Selection
+
+Starting with the language, the strategy is to use the SystemVerilog language, restricted to a feature set described by our
+[Verilog Style Guide](https://github.com/lowRISC/style-guides/blob/master/VerilogCodingStyle.md).
+All IP should be developed and delivered under the feature set described by this style guide.
+Inconsistencies or lack of clarity within the style guide should be solved by filing and helping close an issue on the style guide in the
+[lowrisc/style-guides GitHub repo](https://github.com/lowRISC/style-guides).
+
+For professional tooling, the team has chosen several industry-grade tools for its design signoff process.
+Wherever possible we attempt to remain tool-agnostic, but we must choose a set of tools as our ground truth to have confidence in signoff-level assurances.
+As a project we promote other open source methodologies and work towards a future where these are signoff-grade.
+Discussions of which design tools were chosen and how they are used are given in separate sections below.
+
+## Comportability and the Importance of Architectural Conformity
+
+The OpenTitan program is adopting a design methodology aimed at unifying, as much as possible, the interfaces between individual designs and the rest of the SoC.
+These are detailed in the [Comportability Specification](../rm/comportability_specification.md).
+This document details how peripheral IP interconnects with the embedded processor, the chip IO, other designs, and the security infrastructure within the SoC.
+Not all of the details are complete at this time, but they will be tracked and finalized within that specification.
+
+TODO: briefly discuss key architectural decisions, and how we came to the conclusion, with pointers to more thorough documentation. List?
+* Processor/RISC-V strategy
+* Bus strategy
+* Reset strategy
+
+## Defining Design Complete: milestones and tracking
+
+Designs within the OpenTitan project come in a variety of completion status levels.
+Some designs are “tapeout ready” while others are still a work in progress.
+Understanding the status of a design is important to gauge the confidence in its advertised feature set.
+To that end, we’ve designated a spectrum of design milestones in the *OpenTitan IP project tracking* <Coming Soon> document.
+This document defines the design milestones and references where one can find the current status of each of the designs in the repository.
+
+## Documentation
+
+Documentation is a critical part of any design methodology.
+Within the OpenTitan project there are two important tooling components to efficient and effective documentation.
+The first is the [docgen](../../util/docgen/README.md) tool, which converts an annotated markdown file into a rendered HTML file (including this document).
+See the linked docgen specification for information about the annotations and how to use it to create enhanced auto-generated additions to standard Markdown files.
+The second is the [reggen](../rm/register_tool.md) register tool that helps define the methodology and description language for specifying hardware registers.
+These descriptions are fed into docgen through annotations and ensure that the technical specifications for the IP are accurate and up to date with the hardware being built.
+
+Underlying and critical to this tooling is the human-written content that goes into the source markdown and register descriptions.
+Clarity and consistency are key, and we will add guidelines for technical specification documentation flow (TODO).
+
+## Usage of Register Tool
+
+One design element that is especially ripe for consistent definition and usage is registers.
+Sitting at the intersection of hardware and software, registers are an area where uniformity can reduce confusion and increase reusability.
+The [register tool](../rm/register_tool.md) used within OpenTitan is custom for the project’s needs, but flexible to add new features as they arise.
+It attempts to stay lightweight yet solve most of the needs in this space.
+The description language (using HJSON format) described within that specification also details other features described in the
+[Comportability Specification](../rm/comportability_specification.md).
+See those two specifications as well as the [Markdown Style Guide](../rm/markdown_usage_style.md) for details on the tool and the description language.
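+
+As a brief illustration, a register description in that HJSON format might look like the following sketch. The register and field names here are hypothetical examples; see the register tool specification linked above for the authoritative schema:
+
+```hjson
+{ name: "CTRL",
+  desc: "Example control register (hypothetical)",
+  swaccess: "rw",
+  hwaccess: "hro",
+  fields: [
+    { bits: "0", name: "ENABLE", desc: "Enable the (hypothetical) function" }
+  ]
+}
+```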
+
+## Linting Methodology
+
+Linting is a productivity tool for designers to quickly find typos and bugs at the time when the RTL is written.
+Fast, efficient feedback on syntactic and semantic (as well as style) issues early in the process helps ensure high quality and consistent usage of the language.
+Running lint is especially useful with SystemVerilog, which, unlike more modern hardware description languages, is weakly typed.
+Running lint is faster than running a simulation.
+
+The tool [AscentLint](https://www.realintent.com/rtl-linting-ascent-lint) from the company Real Intent was chosen for this project.
+It has the benefit of fast execution times, and provides a list of concise lint errors and warnings.
+It is understandable that not all partner members will have access to this tool.
+The project will use AscentLint as its linting sign-off tool, and results will be shared in some form through continuous integration build results, published tool outputs, pre-submit checks, and/or linting summaries of tool output (final decision TBD).
+For partners without access to this tool, the recommendation is to run their code through whatever linting tool they have at their disposal before creating a design Pull Request, then work with the maintainers of the linting sign-off methodology to close linting errors.
+(TODO: decide on available pre-submit linting options).
+Linting errors and warnings can be closed by fixing the code in question (preferred), or waiving the error.
+
+Due to the proprietary nature of this particular linting tool, content related to running the tool cannot be checked into an open source repository.
+In the current state of the project, all lint scripts, policy files, and waivers are **not** provided; they are being kept private until we can suggest a workable open source solution.
+When this methodology is finalized the details will be given here. (TODO)
+See the [Linting README](../../hw/lint/README.md) for details on how the tool is being run internally.
+This shows the methodology that we aim to release in a fully open manner; for now it is run internally, with results shared among partner members.
+
+Goals for linting closure per design milestone are given in the *OpenTitan IP project tracking* <Coming Soon> document.
+
+## Assertion Methodology
+
+The creation and maintenance of assertions within RTL design code is an essential way to get feedback if a design is being used improperly.
+Common examples include asserting that a full FIFO should never be written to, a state machine doesn’t receive an input while in a particular state, or two signals should remain mutually exclusive.
+Usually these will eventually result in a downstream error (incorrect data, bus collisions, etc) but early feedback at the first point of inconsistency gives designers and verifiers alike fast access to easier debug.
+
+Within OpenTitan we attempt to maintain uniformity in assertion style and syntax using SystemVerilog Assertions and a list of common macros.
+An overview of the included macros and how to use them is given in this
+[Design Assertion README file](../../hw/formal/README.md).
+This document also describes how to formally verify assertions using
+[JasperGold](https://www.cadence.com/content/cadence-www/global/en_US/home/tools/system-design-and-verification/formal-and-static-verification/jasper-gold-verification-platform/formal-property-verification-app.html)
+from the company Cadence.
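+
+As a hedged illustration of the style (the macro name and argument order are assumptions; the README linked above is authoritative), the "never write a full FIFO" example might be written as:
+
+```systemverilog
+// Hypothetical sketch using an `ASSERT-style common macro:
+// the FIFO must never accept a write while it reports full.
+`ASSERT(FifoNoWriteWhenFull_A, !(fifo_full && fifo_wvalid), clk_i, !rst_ni)
+```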
+
+## CDC Methodology
+
+Logic designs with signals that cross from one clock domain to another unrelated clock domain are notorious for introducing hard-to-debug problems.
+The reason is that design verification, with its constant and unrealistic timing relationships on signals, does not represent the variability and uncertainty of real world systems.
+For this reason, maintaining a robust Clock Domain Crossing verification strategy ("CDC methodology") is critical to the success of any multi-clock design.
+
+Our general strategy is threefold:
+maintain a list of proven domain crossing submodules;
+enforce the usage of these submodules;
+use a production-worthy tool to check all signals within the design conform to correct crossing rules.
+This *CDC Methodology document* <Coming Soon> gives details on the submodules, discusses the tool chosen and how to run it, and explains more rationale for the designs chosen.
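+
+As a sketch of what a proven crossing submodule typically looks like (the module and signal names here are illustrative, not the project's actual IP), a two-flop synchronizer for a single-bit signal might be:
+
+```systemverilog
+module sync_2ff (
+  input  logic clk_dst_i,   // destination-domain clock
+  input  logic rst_dst_ni,  // destination-domain reset, active low
+  input  logic d_src_i,     // signal arriving from the source domain
+  output logic q_dst_o      // synchronized output
+);
+  logic q1;
+  always_ff @(posedge clk_dst_i or negedge rst_dst_ni) begin
+    if (!rst_dst_ni) begin
+      q1      <= 1'b0;
+      q_dst_o <= 1'b0;
+    end else begin
+      q1      <= d_src_i;   // first flop may go metastable
+      q_dst_o <= q1;        // second flop filters that out
+    end
+  end
+endmodule
+```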
+
+The tool chosen for this program is
+[Meridian](https://www.realintent.com/clock-domain-crossing-meridian-cdc)
+from Real Intent.
+It is a sign-off-grade CDC checking tool that provides the features needed for CDC assurance.
+It is understandable that not all partner members will have access to this tool.
+The project will use it as its sign-off tool, and results will be shared in some form (final decision TBD).
+CDC checking errors can be closed by fixing the code in question (preferred), or waiving the error.
+All CDC waivers will be reviewed as part of the Pull Request review process.
+See the *CDC README* <Coming Soon> for details on how to run the tool if you have a Meridian license.
+
+Similar to the linting tool, due to the proprietary nature of the CDC tool, some content related to running the tool cannot be checked into an open source repository.
+For those items, the tool provider will be giving us a method to check in encrypted content that still allows for full functionality without exposing their tool’s feature set.
+When this methodology is finalized the details will be given here. (TODO)
+
+## DFT
+
+Design For Testability is another critical part of any design methodology.
+It is the preparation of a design for a successful manufacturing test regime.
+This includes, but is not limited to, the ability to use scan chains for testing digital logic;
+the optimization of design logic to allow maximum access of test logic for fault coverage;
+the ability to observe and control memory cells and other storage macros;
+the control of analog designs and other items that are often outside the reach of test logic;
+built in self test (BIST) insertion for logic and memories.
+In this context, our primary concern at this stage is the impact this has on the RTL that makes up the IP in our library.
+
+DFT in OpenTitan is particularly interesting for two primary reasons:
+the RTL in the OpenTitan repository is targeted towards an FPGA implementation, but must be prepared for a silicon implementation
+(see the FPGA vs Silicon discussion in the [OpenTitan Product](../../doc/product.md) document);
+the whole purpose of a DFT methodology is full and efficient access to all logic and storage content,
+while the whole purpose of a security microcontroller is restricting access to private secured information.
+In light of the latter dilemma, special care must be taken in a security design to ensure DFT has access at only the appropriate times, but not while in use in production.
+
+At this time the DFT methodology for OpenTitan is not finalized.
+The expectation is that the RTL collateral will undergo a DFT introduction -
+likely with the propagation of such signals as `testmode`, `scanmode`, `bistmode`, etc -
+at a stage before final project completion.
+At this point there are a few references to such signals but they are not yet built into a coherent whole.
+At that future time the DFT considerations will be fully documented and carried out throughout all IP.
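+
+As a minimal sketch of what propagating such signals can imply for the RTL (the names and structure are assumptions, not the finalized methodology), a scan-mode bypass around a functional clock gate might look like:
+
+```systemverilog
+// Hypothetical sketch: when scanmode_i is asserted, bypass the
+// functional clock gate so scan chains can be shifted freely.
+assign clk_o = scanmode_i ? clk_i : clk_gated;
+```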
+
+## Generated Code
+
+The OpenTitan project contains a lot of generated code through a variety of methods.
+Most modern SystemVerilog-based projects work around the weaknesses in the language in this way.
+Our first goal, however, is to take full advantage of the language wherever possible, and to resort to generated code only where necessary.
+
+At the moment, all generated code is checked in with the source files.
+The pros and cons of this decision are still being discussed, and the decision may be reversed in favor of a master build-all script that regenerates a final design as source files change.
+Until that time, all generated files (see for example the output files from the
+[register generation tool](../rm/register_tool.md))
+are checked in.
+There is a master build file in the repository under `hw/Makefile` that builds all of the `regtool` content.
+This is used by an Azure Pipelines presubmit check script to ensure that the source files produce a generated file that is identical to the one being submitted.
+
+## Getting Started with a Design
+
+The process for getting started with a design involves many steps, including getting clarity on its purpose, its feature set, ownership assignments, documentation, etc.
+These are discussed in the *Getting Started with a Design* <Coming Soon> document that is still being developed.
diff --git a/doc/ug/fpga_boards.md b/doc/ug/fpga_boards.md
new file mode 100644
index 0000000..70b38b1
--- /dev/null
+++ b/doc/ug/fpga_boards.md
@@ -0,0 +1,20 @@
+# Get an FPGA Board
+
+FPGA boards come at different price points, with the price being a good indicator for how much logic the FPGA can hold.
+The following sections give details of how to obtain our supported FPGA boards.
+
+## Nexys Video
+
+The Nexys Video board features a Xilinx Artix 7 XC7A200T FPGA.
+The board is produced by Digilent.
+
+### Ordering
+
+You can get the Nexys Video board from the [Digilent web store](https://store.digilentinc.com/nexys-video-artix-7-fpga-trainer-board-for-multimedia-applications/).
+Significant discounts are available for academic users (including students), see the [Digilent academic price list](https://reference.digilentinc.com/_media/sales-resources/academic_prices.pdf) for details.
+
+Most large electronics distributors like Farnell/element14, RS Components, Mouser or Digikey list the board as well.
+
+### Notes
+
+* The board ships with a US and European power plug. Users in other countries need to use or order an adapter, e.g. [this one from the Digilent store](https://store.digilentinc.com/european-uk-wall-plug-adapter/) (but any travel adapter works).
diff --git a/doc/ug/getting_started.md b/doc/ug/getting_started.md
new file mode 100644
index 0000000..ead4706
--- /dev/null
+++ b/doc/ug/getting_started.md
@@ -0,0 +1,28 @@
+# Getting started
+
+Welcome!
+
+This guide helps you to get started with the lowRISC Comportable chip designs.
+
+## Conventions in this guide
+
+This guide uses the environment variable `$REPO_TOP` to refer to the top-level of the git source tree.
+The master tree is held on GitHub; this should be forked to user trees from which Pull Requests can be made.
+There is a set of [Notes for using GitHub](github_notes.html).
+
+## Setup
+
+You can either follow the [install instructions](install_instructions.md) from start to end to install all software required to simulate the design with Verilator and to build a bitstream for an FPGA with Xilinx Vivado, or check the corresponding [design description](getting_started.md#choose-a-design-to-build) for install requirements.
+
+## Choose a design to build
+
+The code base contains multiple top-level designs, which can be synthesized or compiled for different targets.
+A target can be a specific FPGA board, an ASIC technology, or a simulation tool.
+The hardware you need to obtain and the tools you need to install depend on the chosen top-level design and the target.
+
+In order to continue, choose a system from the [List of Systems](/doc/ug/system_list.html).
+Read the design documentation for the requirements on the specific design/target combination, and then follow the appropriate steps below.
+
+* [Build software](getting_started_sw.html)
+* [Getting started with Verilator](getting_started_verilator.html)
+* [Getting started on FPGAs](getting_started_fpga.html)
diff --git a/doc/ug/getting_started_fpga.md b/doc/ug/getting_started_fpga.md
new file mode 100644
index 0000000..04495be
--- /dev/null
+++ b/doc/ug/getting_started_fpga.md
@@ -0,0 +1,197 @@
+# Getting started on FPGAs
+
+Do you want to try out the lowRISC chip designs, but don't have a couple thousand or million dollars ready for an ASIC tapeout?
+Running lowRISC designs on an FPGA board can be the answer!
+
+## Prerequisites
+
+To use the lowRISC Comportable designs on an FPGA you need two things:
+
+* A supported FPGA board
+* A tool from the FPGA vendor
+
+Depending on the design/target combination that you want to synthesize you will need different tools and boards.
+Refer to the design documentation for information on exactly what is needed.
+
+* [Obtain an FPGA board](fpga_boards.html)
+
+Follow the install instructions to [prepare the system](install_instructions.md#system-preparation) and to install the [software development tools](install_instructions.md#software-development) and [Xilinx Vivado](install_instructions.md#xilinx-vivado).
+
+## Create an FPGA bitstream
+
+Synthesizing a design for an FPGA board is done with the following commands.
+
+The FPGA build pulls in a program to run from the internal SRAM.
+This program comes from the `sw/hello_world` directory (see the
+`parameters:` section of the `top_earlgrey_nexysvideo.core` file).
+At the moment there is no check that `hello_world.vmem` is up to
+date, so it is best to follow the instructions to [Build
+software](getting_started_sw.html) and run `make` to ensure the vmem is
+up to date before starting an FPGA build.
+
+In the following example we synthesize the Earl Grey design for the Nexys Video board using Xilinx Vivado 2018.3.
+
+```console
+$ . /tools/xilinx/Vivado/2018.3/settings64.sh
+$ cd $REPO_TOP
+$ fusesoc --cores-root . build lowrisc:systems:top_earlgrey_nexysvideo
+```
+
+The resulting bitstream is located at `build/lowrisc_systems_top_earlgrey_nexysvideo_0.1/synth-vivado/lowrisc_systems_top_earlgrey_nexysvideo_0.1.bit`.
+
+## Create an FPGA bitstream loaded with the boot ROM ELF
+
+This uses the FPGA splice flow to load the software boot ROM contents onto FPGA BRAMs
+and create an embedded FPGA bitstream.
+The script assumes there is a pre-generated FPGA bit file in the build directory at
+`build/lowrisc_systems_top_earlgrey_nexysvideo_0.1/synth-vivado/lowrisc_systems_top_earlgrey_nexysvideo_0.1.bit`.
+The software boot ROM mem file is auto-generated.
+
+Usage:
+```console
+$ cd $REPO_TOP
+$ ./util/fpga/splice_nexysvideo.sh
+```
+
+The resulting updated bitfile is written alongside the raw Vivado bitfile,
+with `.splice` added before the `.bit` extension:
+`build/lowrisc_systems_top_earlgrey_nexysvideo_0.1/synth-vivado/lowrisc_systems_top_earlgrey_nexysvideo_0.1.splice.bit`
+
+
+## Flash the bitstream onto the FPGA
+
+To flash the bitstream onto the FPGA you can use either the Vivado GUI or the command line.
+
+### Using the command line
+
+Use the following command to program the FPGA with fusesoc.
+
+```console
+$ . /tools/xilinx/Vivado/2018.3/settings64.sh
+$ cd $REPO_TOP
+$ fusesoc --cores-root . pgm lowrisc:systems:top_earlgrey_nexysvideo:0.1
+```
+
+Note: `fusesoc pgm` is broken for edalize versions up to (and including) v0.1.3.
+You can check the version you're using with `pip3 show edalize`.
+
+### Using the Vivado GUI
+
+```console
+$ . /tools/xilinx/Vivado/2018.3/settings64.sh
+$ cd $REPO_TOP
+$ make -C build/lowrisc_systems_top_earlgrey_nexysvideo_0.1/synth-vivado build-gui
+```
+
+Now the Vivado GUI opens and loads the project.
+
+* Connect the FPGA board to the PC and turn it on.
+* In the navigation on the left, click on *PROGRAM AND DEBUG* > *Open Hardware Manager* > *Open Target* > *Auto Connect*.
+* Vivado now enumerates all boards and connects to the selected one. (Note: on Vivado 2018.1 you may get an error the first time and have to run auto connect twice.)
+* Click on *Program Device* in the menu on the left (or at the top of the screen).
+* A dialog titled *Program Device* pops up. Select the file *lowrisc_systems_top_earlgrey_nexysvideo_0.1.bit* as *Bitstream file*, and leave the *Debug probes file* empty.
+* Click on *Program* to flash the FPGA with the bitstream.
+* The FPGA is ready as soon as the programming finishes.
+
+
+## Testing the demo design
+
+The Earl Grey toplevel design comes with demo software that shows off some capabilities of the design.
+
+* Use a Micro USB cable to connect the PC with the *PROG*-labeled connector on the board.
+* Use a second Micro USB cable to connect the PC with the *UART*-labeled connector on the board.
+* After connecting the UART, use `dmesg` to determine which serial port was assigned. It should be named `/dev/ttyUSB*`, e.g. `/dev/ttyUSB0`.
+* Ensure that you have sufficient access permissions to the device, check `ls -l /dev/ttyUSB*`. The udev rules given in the Vivado installation instructions ensure this.
+* Generate the bitstream and flash it to the FPGA as described above.
+* Open a serial console (use the device file determined before) and connect.
+ Settings: 230400 baud, 8N1, no hardware or software flow control.
+ ```console
+ $ screen /dev/ttyUSB0 230400
+ ```
+ Note that the Nexys Video demo program that comes installed on the
+ board runs the UART at 115200 baud, so expect to see garbage
+ characters if that program is running (e.g. if you connect the serial
+ console before using Vivado to program your new bitstream, or if you
+ press the *PROG* button, which causes the FPGA to reprogram from the
+ code in the on-board SPI flash).
+
+* On the Nexys Video board, press the red button labeled *CPU_RESET*.
+* Observe the output both on the board and the serial console. Type any text into the console window.
+* Exit `screen` by pressing CTRL-a k, and confirm with y.
+
+## Develop with the Vivado GUI
+
+Sometimes it is helpful to use the Vivado GUI to debug a design.
+fusesoc makes that easy, with one small caveat: by default fusesoc copies all source files into a staging directory before the synthesis process starts.
+This behavior is helpful to create reproducible builds and avoids Vivado modifying checked-in source files.
+But during debugging this behavior is not helpful.
+The `--no-export` option of fusesoc disables copying the source files into the staging area, and `--setup` instructs fusesoc to only create a project file, but not to run the synthesis process.
+
+```console
+$ # only create Vivado project file
+$ fusesoc --cores-root . build --no-export --setup lowrisc:systems:top_earlgrey_nexysvideo
+```
+
+## Connect with OpenOCD and debug
+
+To connect the FPGA with OpenOCD, run the following command
+
+```console
+$ cd $REPO_TOP
+$ /tools/openocd/bin/openocd -s util/openocd -f board/lowrisc-earlgrey-nexysvideo.cfg
+```
+
+To actually debug through OpenOCD, connect to it with either telnet or GDB.
+
+### Debug with OpenOCD
+
+The following is an example of using telnet:
+
+```console
+$ telnet localhost 4444 // or whatever port is specified by the openocd command above
+$ mdw 0x8000 0x10 // read 16 bytes at address 0x8000
+```
+
+### Debug with GDB
+
+An example connection with GDB, which prints the registers after the connection to OpenOCD is established
+
+```console
+$ cd $REPO_TOP
+$ riscv32-unknown-elf-gdb -ex "target extended-remote :3333" -ex "info reg" sw/boot_rom/boot_rom.elf
+```
+
+#### Common operations with GDB
+
+Examine 16 memory words in the hex format starting at 0x200005c0
+
+```console
+(gdb) x/16xw 0x200005c0
+```
+
+Press enter again to print the next 16 words.
+Use `help x` to get a description of the command.
+
+If the memory content contains program text it can be disassembled
+
+```console
+(gdb) disassemble 0x200005c0,0x200005c0+16*4
+```
+
+Displaying the memory content can also be delegated to OpenOCD
+
+```console
+(gdb) monitor mdw 0x200005c0 16
+```
+
+Use `monitor help` to get a list of supported commands.
+
+To change the program which is debugged the `file` command can be used.
+This will update the symbols which are used to get information about the program.
+It is especially useful in the context of our `boot_rom.elf`, which resides in the ROM region and eventually jumps to a different executable in the flash region.
+
+```console
+(gdb) file sw/tests/hello_world/hello_world.elf
+(gdb) disassemble 0x200005c0,0x200005c0+16*4
+```
+
+The output of the disassemble should now contain additional information.
diff --git a/doc/ug/getting_started_sw.md b/doc/ug/getting_started_sw.md
new file mode 100644
index 0000000..126dfff
--- /dev/null
+++ b/doc/ug/getting_started_sw.md
@@ -0,0 +1,19 @@
+# Build Software
+
+## Prerequisites
+
+_Make sure you followed the install instructions to [prepare the system](install_instructions.html#system-preparation) and install the [compiler toolchain](install_instructions.html#compiler-toolchain)._
+
+## Building software
+
+```console
+$ cd $REPO_TOP/sw/hello_world
+$ make CC=/tools/riscv/bin/riscv32-unknown-elf-gcc
+```
+
+The build process produces a variety of output files.
+
+* `.elf`: the linked program in ELF format
+* `.bin`: the linked program as plain binary
+* `.dis`: the disassembled program
+* `.vmem`: a Verilog memory file which can be read by `$readmemh()` in Verilog code
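+
+As an illustration of how such a `.vmem` file is consumed, a simulated memory can load it with `$readmemh()`. This is a hedged sketch; the memory name, word count, and file path are hypothetical, not taken from the actual design:
+
+```systemverilog
+// Hypothetical sketch: load a .vmem image into a simulated memory array.
+logic [31:0] mem [0:4095];  // memory depth is an assumption
+
+initial begin
+  // Each line of the .vmem file holds one hex word for $readmemh().
+  $readmemh("hello_world.vmem", mem);
+end
+```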
diff --git a/doc/ug/getting_started_verilator.md b/doc/ug/getting_started_verilator.md
new file mode 100644
index 0000000..ecfb073
--- /dev/null
+++ b/doc/ug/getting_started_verilator.md
@@ -0,0 +1,189 @@
+# Getting started with Verilator
+
+## About Verilator
+
+Verilator is a cycle-accurate simulation tool.
+It translates synthesizable Verilog code into a simulation program in C++, which is then compiled and executed.
+
+## Prerequisites
+
+_Make sure you followed the install instructions to [prepare the system](install_instructions.md#system-preparation) and to install the [software development tools](install_instructions.md#software-development) and [Verilator](install_instructions.md#verilator)._
+
+## Simulating a design with Verilator
+
+First, the simulation itself needs to be built.
+
+```console
+$ cd $REPO_TOP
+$ fusesoc --cores-root . sim --build-only lowrisc:systems:top_earlgrey_verilator
+```
+
+Then we need to build the software to run on the simulated system.
+There are three memory types: ROM, RAM, and flash.
+By default, the system first executes out of ROM and then jumps to flash.
+Until ROM functionality for code download is ready, a program needs to be built for each.
+
+For that purpose, compile the demo program with "simulation" settings, which adjust the frequencies to better match the simulation speed.
+
+```console
+$ cd $REPO_TOP/sw/boot_rom
+$ make clean
+$ make
+$
+$ cd $REPO_TOP/sw/hello_world
+$ make clean
+$ make SIM=1
+```
+
+Now the simulation can be run.
+The programs listed after `--rominit` and `--flashinit` are loaded into the system's respective memories and start executing immediately.
+
+```console
+$ cd $REPO_TOP
+$ build/lowrisc_systems_top_earlgrey_verilator_0.1/sim-verilator/Vtop_earlgrey_verilator --rominit=sw/boot_rom/boot_rom.vmem \
+    --flashinit=sw/hello_world/hello_world.vmem
+```
+
+To stop the simulation press CTRL-c.
+
+## Interacting with the simulated UART
+
+The simulation contains code to create a virtual UART port.
+When starting the simulation you should see a message like
+
+```console
+UART: Created /dev/pts/11 for uart0. Connect to it with any terminal program, e.g.
+$ screen /dev/pts/11
+```
+
+Use any terminal program, e.g. `screen` to connect to the simulation.
+If you only want to see the program output you can use `cat` instead.
+
+```console
+$ # to only see the program output
+$ cat /dev/pts/11
+
+$ # to interact with the simulation
+$ screen /dev/pts/11
+```
+
+You can exit `screen` (in the default configuration) by pressing `CTRL-a k` and confirm with `y`.
+
+## See GPIO output
+
+The simulation includes a DPI module to send all GPIO outputs to a POSIX FIFO file.
+The changing output can be observed with
+
+```console
+$ cat gpio0
+```
+
+Passing input is currently not supported.
+
+## Connect with OpenOCD to the JTAG port
+
+The simulation includes a "virtual JTAG" port to which OpenOCD can connect using its `remote_bitbang` driver.
+All necessary configuration files are included in this repository.
+
+Run the simulation, then connect with OpenOCD using the following command.
+
+```console
+$ cd $REPO_TOP
+$ /tools/openocd/bin/openocd -s util/openocd -f board/lowrisc-earlgrey-verilator.cfg
+```
+
+You can also run the debug compliance test suite built into OpenOCD.
+
+```console
+$ cd $REPO_TOP
+$ /tools/openocd/bin/openocd -s util/openocd -f board/lowrisc-earlgrey-verilator.cfg -c 'init; riscv test_compliance; shutdown'
+```
+## SPI device test interface
+
+The simulation contains code to monitor the SPI bus and provide a master interface to allow interaction with the `spi_device`.
+When starting the simulation you should see a message like
+
+```console
+SPI: Created /dev/pts/4 for spi0. Connect to it with any terminal program, e.g.
+$ screen /dev/pts/4
+NOTE: a SPI transaction is run for every 4 characters entered.
+SPI: Monitor output file created at /auto/homes/mdh10/github/opentitan/spi0.log. Works well with tail:
+$ tail -f /auto/homes/mdh10/github/opentitan/spi0.log
+```
+
+Use any terminal program, e.g. `screen` or `microcom` to connect to the simulation.
+
+```console
+$ screen /dev/pts/4
+```
+
+Microcom seems less likely to send unexpected control codes when starting:
+```console
+$ microcom -p /dev/pts/4
+```
+
+The terminal will accept (but not echo) characters.
+After 4 characters are received a 4-byte SPI packet is sent containing the characters.
+The four characters received from the SPI transaction are echoed to the terminal.
+The `hello_world` code will print out the bytes received from the SPI port (substituting _ for non-printable characters).
+The `hello_world` code initially sets the SPI transmitter to return `SPI!` (so that should echo after the first four characters are typed).
+When bytes are received, it inverts their bottom bit and sets them for transmission in the next transfer, so the Nth set of four characters typed should produce an echo of the (N-1)th set with the bottom bit inverted.
+
+The SPI monitor output is written to a file.
+It may be monitored with `tail -f` which conveniently notices when the file is truncated on a new run, so does not need restarting between simulations.
+The output consists of a textual "waveform" representing the SPI signals.
+
+## USB device test interface
+
+The simulation contains code to exercise and monitor the USB bus and provide a host interface to allow interaction with the `usbdev` and `usbuart` modules.
+When starting the simulation you should see a message like
+
+```console
+USB: FIFO pipe created at /auto/homes/mdh10/github/opentitan/usb0. Run
+$ cat /auto/homes/mdh10/github/opentitan/usb0
+to observe the output.
+```
+
+The test code currently acts as a host: it generates the basic USB control transactions to set up the interface (set the Device ID, read the Device Descriptor), sends regular (but not at 1 ms spacing) Start-of-Frame packets with an incrementing frame number, does an IN bulk transfer from endpoint 1 and occasionally an OUT bulk transfer to endpoint 1.
+The code will finish the simulation after a small number of USB Frames if tracing is enabled and a large number if tracing is not enabled.
+The test code is written directly in the `usbdpi.c` main loop and is fragile with regards to timing. (See [Issue #NNN](https://github.com/lowRISC/opentitan/issues/NNN))
+
+The test code is sufficient to work with the `usbuart` and `hello_world` program and will display the output characters as they arrive in USB packets and send the string `Hi!` to the simulation, which will be echoed and cause the GPIOs to change.
+
+The test code is sufficient to work with the `usbdev` and `hello_usbdev` program and will configure the interface and send the string `Hi!` to the simulation, which will be written to the UART.
+
+The USB monitor can be configured (in the DPI code) to output low-level bit events from the USB bus, but by default it displays higher-level packet information. Like the GPIO, it outputs to a named pipe.
+
+It can be convenient to monitor both the GPIO and USB outputs in the same terminal window.
+
+```console
+$ cd $REPO_TOP
+$ cat gpio0 & cat usb0
+```
+
+Or:
+
+```console
+$ cd $REPO_TOP
+$ tail -f gpio0 usb0
+```
+
+Because the setup process (connect a terminal program to `/dev/pts/` for the UART, start monitoring the GPIO and USB named pipes) can take some time, the `usbdpi` code has a 7 second sleep (with countdown) when it is called for simulation cycle 0.
+This delay starts after all the ptys/pipes/files have been opened but before any action and gives enough time for the correct commands to be started in other terminal windows.
+
+## DPI Source
+
+The I/O interfaces described above are implemented using the DPI interface to Verilator.
+The code for these is stored in the repo at `hw/dv/dpi` with a sub-directory for each module.
+There should be a fusesoc `.core` file in each sub-directory.
+
+## Generating waveforms
+
+With the `--trace` argument the simulation generates an FST signal trace which can only be viewed with GTKWave.
+Tracing slows down the simulation by roughly a factor of 1000.
+
+```console
+$ cd $REPO_TOP
+$ build/lowrisc_systems_top_earlgrey_verilator_0.1/sim-verilator/Vtop_earlgrey_verilator --meminit=sw/hello_world/hello_world.vmem --trace
+$ gtkwave sim.fst
+```
diff --git a/doc/ug/github_notes.md b/doc/ug/github_notes.md
new file mode 100644
index 0000000..0eb3676
--- /dev/null
+++ b/doc/ug/github_notes.md
@@ -0,0 +1,336 @@
+# GitHub Notes
+
+The OpenTitan source tree is maintained on GitHub in a [monolithic repository](https://github.com/lowRISC/opentitan) called `opentitan`.
+
+This file provides some notes on using GitHub for developing in the
+monolithic repository, based on notes taken by a relatively inexperienced git
+user. There is much more to using git; a possible next step is to
+reference [Resources to learn Git](https://try.github.io/).
+
+## Getting a working repository
+
+To develop in the repo you will need a copy on your local
+machine. To allow contributions to be made back to the main repo
+(through a process called a Pull Request) you need to first make your
+own copy of the repo on GitHub and then transfer that to your local
+machine.
+
+You will need to log in to GitHub, go to the [opentitan repository](https://github.com/lowRISC/opentitan) and click on
+"Fork". This will generate a copy in your GitHub area that you can
+use.
+
+Then set up git on your local machine and set some standard parameters
+to ensure your name and email are correctly inserted. These commands
+set them globally for all your use of git on the local machine, so you
+may have done this step already; there is no need to repeat it. (See
+below if you want to use a different email address for this repo.)
+
+Check the parameters:
+```console
+$ git config -l
+```
+
+And if they do not exist then set them:
+
+```console
+$ git config --global user.name "My Name"
+$ git config --global user.email "my_name@email.com"
+```
+
+`git` will take care of prompting for your GitHub user name and
+password when required, but it can be useful to allow it to
+cache these credentials (set here to an hour using the timeout in
+seconds) so you don't have to enter them every time:
+
+```console
+$ git config --global credential.helper 'cache --timeout=3600'
+```
+
+Now make a local copy of your GitHub copy of the repo and let git know
+that it is derived from the **upstream** lowRISC repo:
+
+```console
+$ cd <where the repo should go>
+$ git clone https://github.com/$GITHUB_USER/opentitan.git
+$ cd opentitan
+$ git remote add upstream https://github.com/lowRISC/opentitan.git
+$ git remote set-url --push upstream disabled
+$ git remote -v
+```
+
+The `git remote -v` should give your GitHub copy as **origin** and the
+lowRISC one as **upstream**. Making this link will allow you to keep your
+local and GitHub repos up to date with the lowRISC one.
+
+If you want a different email address (or name) for the lowRISC repo then
+you can set it locally in the repo (similar to above but without the
+`--global` flag). This command must be executed from a directory inside
+the local copy of the repo. (There is no need for the first `cd` if
+you are following on from the previous step.)
+
+```console
+$ cd opentitan
+$ git config user.email "my_name@lowrisc.org"
+```
+
+## Working in your local repo
+
+The repo that you have created locally will initially be on the
+**master** branch. In general you should not make changes on this
+branch, just use it to track your GitHub repo and synchronize with the
+lowRISC master repo.
+
+The typical workflow is to make your own branch; it is
+conventional to name it based on the change you are making:
+
+```console
+$ git checkout -b forchange
+$ git status
+```
+
+The status will initially indicate there are no changes, but as you
+add, delete or edit files it will let you know the state of things.
+
+Once you are happy with your changes, commit them to the local repo by adding the files to the change (if the tree is otherwise clean you can add everything using `git commit -a` instead of a number of `add` commands):
+
+```console
+$ git add...
+$ git commit
+```
+
+The commit will prompt you for a message. The first line of this is a
+short summary of the change. It should be prefixed with a word in
+square brackets indicating the area being changed, typically the IP
+block or tool name. For example:
+
+```
+[doc/um] Add notes on using GitHub and the repo
+```
+
+After this there should be a blank line and the main description of
+the change. If you are fixing an issue then add a line at the end of
+the message `Fixes #nn` where `nn` is the issue number. This will link
+the fix and close out the issue when it is added to the lowRISC repo.
+
+When you have finished everything locally (it is good practice to do a
+status check to ensure things are clean) you can push your branch (e.g.
+`forchange`) to **your** GitHub repo (the **origin**):
+
+```console
+$ git status
+$ git push origin forchange
+```
+
+Then go to your repo on GitHub and either select the branch
+from the pulldown or click on the status message that often appears;
+review the changes and make a Pull Request. You can add
+reviewers and get your change reviewed.
+
+If you need to make changes to satisfy the reviews then do that in
+your local repo on the same branch. You will need to `add` files and
+commit again. It is normally best to squash your changes into a single
+commit by committing with `--amend`, which will give you a chance to edit
+the message. If you do this you need to force (`-f`) the push back to
+your repo.
+
+```console
+$ git add...
+$ git commit --amend
+$ git status
+$ git push -f origin forchange
+```
+
+Once the reviewers are happy you can "Squash and merge" the Pull
+Request on GitHub and delete the branch there (GitHub offers to do this
+when you do the merge). You can delete the branch in your local repo with:
+
+```console
+$ git checkout master
+$ git branch -D forchange
+```
+
+## Update your repo with changes in the lowRISC repo
+
+There is a little work to do to keep everything in sync. Normally you
+want to first get your local repo master branch up to date with the
+lowRISC repo (**upstream**) and then you use that to update your GitHub
+copy (**origin**).
+
+```console
+$ git checkout master
+$ git pull upstream master
+$ git push origin
+```
+
+If you do this while you have changes on some other branch, then before
+a Pull Request will work you need to be sure your branch merges
+cleanly into the new lowRISC repo. Assuming you got the local master
+branch up to date with the procedure above, you can now **rebase** your
+changes on the new master. Assuming your changes are on the local
+branch `forchange`:
+
+```console
+$ git checkout forchange
+$ git rebase master
+```
+
+If you are lucky this will just work: the rebase unwinds your changes,
+gets the updated master and replays your changes on top. If there are
+conflicts then you need a big pot of coffee and patience (see the next section).
+
+Once everything has rebased properly you can do:
+
+```console
+$ git log
+```
+
+And see that the changes you committed on the branch are at the top of
+the log followed by the latest changes on the master branch.
+
+## Dealing with conflicts after a rebase
+
+If a rebase fails because of conflicts between your changes and the
+code you are rebasing onto, git will leave your working directories
+in a bit of a mess and expect you to fix it. Often the conflict is
+simple (e.g. you and someone else added a new routine at the same place
+in the file) and resolution is simple (keep both in the new
+output). Sometimes there is more to untangle if different changes were
+made to the same routine. In either case git has marked that you are
+in a conflict state and work is needed before you can go back to using
+your local git tree as usual.
+
+The git output actually describes what to do (once you are used to how
+to read it). For example:
+
+```
+$ git rebase master
+First, rewinding head to replay your work on top of it...
+Applying: [util][pystyle] Clean python style in single file tools
+Using index info to reconstruct a base tree...
+M util/diff_generated_util_output.py
+M util/build_docs.py
+Falling back to patching base and 3-way merge...
+Auto-merging util/build_docs.py
+CONFLICT (content): Merge conflict in util/build_docs.py
+Auto-merging util/diff_generated_util_output.py
+error: Failed to merge in the changes.
+Patch failed at 0001 [util][pystyle] Clean python style in single file tools
+Use 'git am --show-current-patch' to see the failed patch
+
+Resolve all conflicts manually, mark them as resolved with
+"git add/rm <conflicted_files>", then run "git rebase --continue".
+You can instead skip this commit: run "git rebase --skip".
+To abort and get back to the state before "git rebase", run "git rebase --abort".
+```
+
+The last line of this gives the ultimate out. You can abort the rebase
+and figure some other way to proceed. As it says, this is done with:
+
+```console
+$ git rebase --abort
+```
+
+After executing this command you are back to a clean tree with your
+changes intact, but they are still based on whatever the earlier state
+of the repo was. Normally you will have to resolve the conflict
+sometime, but the escape hatch can be useful if you don't have time
+immediately!
+
+In the normal case, read the output to find the file with the
+problem. In this case it is `Merge conflict in util/build_docs.py`. (The merge
+of `util/diff_generated_util_output.py` was successful even though it
+is mentioned in the middle of what looks like error output.)
+
+If the file is opened with an editor, the points at which there are
+conflicts will have diff-style change information embedded in them. For example:
+
+```
+<<<<<<< HEAD
+import livereload
+
+import docgen.generate
+=======
+import docgen
+import livereload
+>>>>>>> [util][pystyle] Clean python style in single file tools
+
+```
+
+In this case the master tree (between `<<<<<<< HEAD` and `=======`)
+was modified to import `docgen.generate` rather than just `docgen` and
+the local tree (between `=======` and `>>>>>>>` followed by the first
+line of the commit message) had been changed to re-order the
+imports. These lines have to be edited to get the correct merged
+result and the diff markers removed. There may be multiple points in
+the file where fixes are needed. Once all conflicts have been
+addressed the file can be `git add`ed and once all files addressed the
+rebase continued.
+
+After the fix a status report will remind you where you are.
+
+```console
+$ git status
+rebase in progress; onto cb85dc4
+You are currently rebasing branch 'sastyle' on 'cb85dc4'.
+ (all conflicts fixed: run "git rebase --continue")
+
+Changes to be committed:
+ (use "git reset HEAD <file>..." to unstage)
+
+ modified: diff_generated_util_output.py
+ modified: build_docs.py
+
+Changes not staged for commit:
+ (use "git add <file>..." to update what will be committed)
+ (use "git checkout -- <file>..." to discard changes in working directory)
+
+ modified: build_docs.py
+
+```
+
+This gives the same instructions as the original merge failure message
+and gives the comfort that all conflicts were fixed. To finish up you
+need to follow the instructions.
+
+```console
+$ git add build_docs.py
+$ git rebase --continue
+Applying: [util][pystyle] Clean python style in single file tools
+```
+
+If there was more than one patch outstanding (which isn't usual if
+you use the `commit --amend` flow) then you may get subsequent
+conflicts after the `rebase --continue` as other patches are
+replayed.
+
+You can check the rebase worked as expected by looking at the log to
+see your branch is one commit (or more if there were more) ahead of
+the master branch.
+
+```console
+$ git log
+
+commit dd8721d2b1529c575c4aef988219fbf2ecd3fd1b (HEAD -> sastyle)
+Author: Mark Hayter <mark.hayter@gmail.com>
+Date: Thu Jan 10 09:41:20 2019 +0000
+
+ [util][pystyle] Clean python style in single file tools
+
+ Result of lintpy.py --fix on the diff and build_docs tools
+
+ Tested with ./diff_generated_util_output.py master
+
+commit cb85dc42199e925ad09c45d33f6483a14764b93e (upstream/master, origin/master, origin/HEAD, master)
+
+```
+
+This shows the new commit (`HEAD` of the branch `sastyle`) and the
+preceding commit is at the `master` branch (and at the same point as
+`master` on both `origin` and `upstream` so everything is in sync at
+master).
+
+At this point the conflicts have been cleared and the local repo can
+be used as expected.
+
+You may find it useful to change the default way git reports conflicts in a file. See [Take the pain out of git conflict resolution: use diff3](https://blog.nilbus.com/take-the-pain-out-of-git-conflict-resolution-use-diff3/).
diff --git a/doc/ug/img/install_vivado/step1.png b/doc/ug/img/install_vivado/step1.png
new file mode 100644
index 0000000..5eac59a
--- /dev/null
+++ b/doc/ug/img/install_vivado/step1.png
Binary files differ
diff --git a/doc/ug/img/install_vivado/step2.png b/doc/ug/img/install_vivado/step2.png
new file mode 100644
index 0000000..a76db34
--- /dev/null
+++ b/doc/ug/img/install_vivado/step2.png
Binary files differ
diff --git a/doc/ug/img/install_vivado/step3.png b/doc/ug/img/install_vivado/step3.png
new file mode 100644
index 0000000..7459026
--- /dev/null
+++ b/doc/ug/img/install_vivado/step3.png
Binary files differ
diff --git a/doc/ug/img/install_vivado/step4.png b/doc/ug/img/install_vivado/step4.png
new file mode 100644
index 0000000..a597008
--- /dev/null
+++ b/doc/ug/img/install_vivado/step4.png
Binary files differ
diff --git a/doc/ug/img/install_vivado/step5.png b/doc/ug/img/install_vivado/step5.png
new file mode 100644
index 0000000..8c4c853
--- /dev/null
+++ b/doc/ug/img/install_vivado/step5.png
Binary files differ
diff --git a/doc/ug/img/install_vivado/step6.png b/doc/ug/img/install_vivado/step6.png
new file mode 100644
index 0000000..0d6e175
--- /dev/null
+++ b/doc/ug/img/install_vivado/step6.png
Binary files differ
diff --git a/doc/ug/img/install_vivado/step7.png b/doc/ug/img/install_vivado/step7.png
new file mode 100644
index 0000000..bee3ac3
--- /dev/null
+++ b/doc/ug/img/install_vivado/step7.png
Binary files differ
diff --git a/doc/ug/img/install_vivado/step8.png b/doc/ug/img/install_vivado/step8.png
new file mode 100644
index 0000000..50288ad
--- /dev/null
+++ b/doc/ug/img/install_vivado/step8.png
Binary files differ
diff --git a/doc/ug/img/install_vivado/vivado_download.png b/doc/ug/img/install_vivado/vivado_download.png
new file mode 100644
index 0000000..adde251
--- /dev/null
+++ b/doc/ug/img/install_vivado/vivado_download.png
Binary files differ
diff --git a/doc/ug/index.md b/doc/ug/index.md
new file mode 100644
index 0000000..ed1c5ae
--- /dev/null
+++ b/doc/ug/index.md
@@ -0,0 +1,42 @@
+# User Guides
+
+* Getting Started
+ * [Getting started](getting_started.md)
+ * [Quickstart](quickstart.md)
+ * [Notes on using GitHub and local git](github_notes.md)
+ * [Build software](getting_started_sw.md)
+ * [Getting started with Verilator](getting_started_verilator.md)
+ * [Getting started on FPGAs](getting_started_fpga.md)
+ * [Obtaining an FPGA board](fpga_boards.md)
+ * [Installing Xilinx Vivado](install_instructions.md#xilinx-vivado)
+ * *Getting started with a design* <Coming Soon>
+ * *Getting started with verification* <Coming Soon>
+* [Work with hardware code in external repositories](vendor_hw.md)
+* [Design Methodology](design.md)
+ * Importance of Verilog Style Guide
+ * Explanation of Comportability concept
+ * Milestones and Tracking
+ * Documentation
+ * Register Tool Usage
+ * Linting Methodology
+ * Assertions Methodology
+ * CDC Methodology
+ * DFT
+ * Generated Code
+* *Verification Methodology* <Coming Soon>
+ * Verification strategy overview
+ * How do we define Verification completion
+ * Current verification status of IP and definition of milestones
+ * Tools
+ * Test planning
+ * Progress and tracking
+ * Code coverage output
+ * How do we report status
+ * Overview of in-tree helper classes, test benches, etc.
+* *Validation Methodology* <Coming Soon>
+ * How to download bit stream
+ * What tests exist today
+ * How to run tests
+ * How does this differ from verification
+ * How to add tests
+* [List of Top-Level Designs](system_list.md)
diff --git a/doc/ug/install_instructions.md b/doc/ug/install_instructions.md
new file mode 100644
index 0000000..64a0ad2
--- /dev/null
+++ b/doc/ug/install_instructions.md
@@ -0,0 +1,266 @@
+# Install Build Requirements
+
+{{% toc 3 }}
+
+## System preparation
+
+_**Note for all Windows users:** many of the tools we're using can in theory work on Windows.
+However, we didn't test on Windows and things will be broken there.
+Unless you are experienced in debugging tool problems on Windows, using Linux will improve your developer experience significantly._
+
+By convention tools which are not provided through a package manager will be installed into `/tools`.
+This directory can be replaced by any sufficiently large directory without spaces in the directory name.
+It is assumed that the user executing the build instructions has full write permissions to this directory; the following commands ensure that.
+
+```console
+$ sudo mkdir /tools
+$ sudo chown $(id -un) /tools
+```
+
+### Install required software
+
+A number of software packages from the distribution's package manager are required.
+All installation instructions below are for Ubuntu 16.04.
+Adjust as necessary for other Linux distributions.
+
+```console
+$ sudo apt-get install python3 python3-pip python3-setuptools build-essential ninja-build pkgconf srecord zlib1g-dev
+```
+
+Some tools in this repository are written in Python 3 and require
+Python dependencies to be installed through `pip`. (Note that the
+`diff_generated_util_output.py` tool works better with Python 3.6 or
+later, where insertion order is preserved in `dict` types; earlier versions
+of Python will show spurious differences caused by things being
+reordered.)
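The ordering behaviour mentioned in the note can be seen with a short snippet (Python 3.7 guarantees insertion order for `dict`; CPython 3.6 already preserved it as an implementation detail):

```python
d = {}
for key in ["clk", "rst_n", "data"]:
    d[key] = True

# On Python 3.6+ the keys come back in insertion order, so tools that
# emit dict contents produce stable, diffable output.
print(list(d))  # ['clk', 'rst_n', 'data']
```

On Python 3.5 and earlier the key order is arbitrary, which is what causes the spurious differences.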
+
+```console
+$ cd $REPO_TOP
+$ pip3 install --user -r python-requirements.txt
+```
+
+The `pip` installation instructions use the `--user` flag to install without root permissions.
+Binaries are installed to `~/.local/bin`; check that this directory is listed in your `PATH` by running `fusesoc --version`.
+If the `fusesoc` binary is not found, add `~/.local/bin` to your `PATH`, e.g. by modifying your `~/.bashrc` file.
+
+## Software development
+
+### Compiler toolchain
+
+To build software you need a baremetal rv32imc compiler toolchain.
+You can either build your own or use a prebuilt one.
+We recommend installing the toolchain to `/tools/riscv`.
+
+#### Option 1 (recommended): Use the lowRISC-provided prebuilt GCC toolchain
+
+lowRISC provides a prebuilt GCC toolchain for the OpenTitan project.
+Download the file starting with `lowrisc-toolchain-gcc-rv32imc-` from [GitHub releases](https://github.com/lowRISC/lowrisc-toolchains/releases/latest) and unpack it to `/tools/riscv`.
+
+Alternatively, use an in-tree helper script.
+
+```console
+$ cd $REPO_TOP
+$ ./util/get-toolchain.py
+```
+
+#### Option 2: Compile your own GCC toolchain
+
+1. Install all build prerequisites listed [in the documentation](https://github.com/riscv/riscv-gnu-toolchain/#prerequisites).
+
+2. Build the toolchain
+ ```console
+ $ git clone --recursive https://github.com/riscv/riscv-gnu-toolchain
+ $ cd riscv-gnu-toolchain
+ $ ./configure --prefix=/tools/riscv --with-abi=ilp32 --with-arch=rv32imc --with-cmodel=medany
+ $ make
+ ```
+
+The `make` command installs the toolchain to `/tools/riscv`; no additional `make install` step is needed.
+
+### OpenOCD
+
+OpenOCD is a tool to connect with the target chip over JTAG and similar transports.
+It also provides a GDB server which acts as an intermediary when debugging software on the chip with GDB.
+
+Unfortunately the upstream sources of OpenOCD do not contain all necessary patches to support RISC-V, and hence typical distribution packages don't work.
+We therefore need to build OpenOCD from source from a forked repository.
+
+For FTDI support the libraries libftdi > 1.0 and libusb > 1.0 are needed.
+Install those packages prior to building OpenOCD.
+
+```console
+$ sudo apt-get install libftdi1-dev libusb-1.0-0-dev
+$ git clone https://github.com/riscv/riscv-openocd.git
+$ cd riscv-openocd
+$ ./bootstrap
+$ mkdir build
+$ cd build
+$ ../configure --enable-ftdi --enable-verbose-jtag-io --disable-vsllink --enable-remote-bitbang --prefix=/tools/openocd
+$ make -j4
+$ sudo make install
+```
+
+## Verilator
+
+Even though Verilator is packaged for most Linux distributions, these versions tend to be too old to be usable.
+We recommend compiling Verilator from source, as outlined here.
+
+First some build prerequisites need to be installed.
+For Ubuntu the following packages are needed.
+```console
+$ sudo apt-get install git make autoconf g++ flex bison
+```
+
+### Install Verilator
+
+Then you can fetch, build and install Verilator itself.
+
+```console
+$ export VERILATOR_VERSION=4.010
+
+$ git clone http://git.veripool.org/git/verilator
+$ cd verilator
+$ git checkout v$VERILATOR_VERSION
+
+$ autoconf
+$ ./configure --prefix=/tools/verilator/$VERILATOR_VERSION
+$ make
+$ make install
+```
+
+After installation you need to add `/tools/verilator/$VERILATOR_VERSION/bin` to your `PATH` environment variable.
+
+## Xilinx Vivado
+
+### System requirements
+
+This guide assumes the following system setup.
+
+* A reasonably powerful PC running Linux.
+ Using a virtual machine can work, but will slow down builds considerably.
+ 8 GB of RAM or more are highly recommended.
+* Physical access to that machine, root permissions and a graphical environment.
+* Python 3.5 or newer. Python 3.6+ is recommended.
+* 60 GB or more of disk space.
+ EDA tools like Xilinx Vivado can easily take up 40 GB each.
+* We develop and test on the following Linux distributions:
+ * Ubuntu 16.04 LTS
+ * Debian testing
+ * openSUSE Tumbleweed
+ * TODO: Check RHEL/CentOS and SLES (used in many commercial environments)
+
+TODO: Be more specific about the system requirements, especially the Linux distribution.
+
+### About Xilinx Vivado
+
+To generate a bitstream for Xilinx devices, software called Vivado is required.
+Vivado is provided by Xilinx; it is freeware for certain (smaller) FPGA devices but requires a commercial license for larger FPGAs.
+The free version is called "WebPACK", the commercial version "Design Edition".
+The installation instructions below are valid for both editions.
+
+Most lowRISC designs support at least one FPGA board which works with a free WebPACK license.
+
+### Install Xilinx Vivado
+
+Vivado can be installed in two ways: either through an "All OS installer Single-File Download", or via the "Linux Self Extracting Web Installer".
+Neither option is great:
+the "All OS installer" is a huge download of around 20 GB (and the Xilinx download servers seem to be overloaded regularly), but it supports an unattended installation.
+The web installer downloads only necessary subsets of the software, which significantly reduces the download size.
+But unfortunately it doesn't support the batch mode for unattended installations, requiring users to click through the GUI and select the right options.
+
+To get started faster we use the web installer in the following.
+
+1. Go to the [Xilinx download page](https://www.xilinx.com/support/download.html) and download two files for the current version of Vivado.
+ (We used Vivado 2018.3 to prepare this guide.)
+ 1. The file "Vivado HLx <VERSION>: WebPACK and Editions - Linux Self Extracting Web Installer".
+ 2. The "Digests" file below the download.
+
+ 
+
+ You need to register for a free Xilinx account to download the software, and you'll need it again later to install the software.
+ Create a new account if you don't have one yet.
+
+2. Before you proceed ensure that the download didn't get corrupted by verifying the checksum.
+
+ ```console
+ $ sha512sum --check Xilinx_Vivado_SDK_Web_2018.3_1207_2324_Lin64.bin.digests
+ Xilinx_Vivado_SDK_Web_2018.3_1207_2324_Lin64.bin: OK
+ sha512sum: WARNING: 22 lines are improperly formatted
+ ```
+
+ If you see an "OK" after the downloaded file proceed to the next step. Otherwise delete the download and start over. (You can ignore the warning produced by `sha512sum`.)
+3. Run the graphical installer.
+
+ ```console
+ $ sh Xilinx_Vivado_SDK_Web_2018.3_1207_2324_Lin64.bin
+ ```
+
+4. Now you need to click through the installer.
+ Click "Next" on the first screen.
+
+ 
+
+5. Type in your Xilinx User ID (your email address) and the associated password.
+ Choose the "Download and Install Now" option.
+ Click "Next" to continue.
+
+ 
+
+6. Click all "I Agree" checkboxes, and click on "Next" to continue.
+
+ 
+
+7. Choose "Vivado HL WebPACK" if you do not have a commercial Vivado license, or "Vivado HL Design Edition" if you have a valid license.
+ In this walk through we'll install the WebPACK edition.
+
+ 
+
+8. Choose the features to install.
+ You can restrict the features to the ones shown in the screenshot below.
+ Click "Next" to continue.
+
+ 
+
+9. Choose an installation location.
+ Any location which doesn't have a whitespace in its path and enough free space is fine.
+ We use `/tools` in our example, but a path in `/opt` or within the home directory works equally well.
+ Click "Next" to continue.
+
+ 
+
+10. Double-check the installation summary and click on "Next" to start the installation process.
+
+ 
+
+11. Now Vivado is downloaded and installed, a process which can easily take multiple hours.
+
+ 
+
+12. As soon as the installation has completed close the installer and you're now ready to use Vivado!
+
+### Device permissions: udev rules
+
+To program FPGAs, the user running Vivado typically needs permission to access the USB devices connected to the PC.
+Depending on your security policy you can take different steps to enable this access.
+One way of doing so is the udev rule outlined below.
+
+To do so, create a file named `/etc/udev/rules.d/90-lowrisc.rules` and add the following content to it:
+
+```
+# Grant access to board peripherals connected over USB:
+# - The USB device itself (used e.g. by Vivado to program the FPGA)
+# - Virtual UART at /dev/tty/XXX
+
+# Future Technology Devices International, Ltd FT2232C/D/H Dual UART/FIFO IC
+# used on Digilent boards
+ACTION=="add|change", SUBSYSTEM=="usb|tty", ATTRS{idVendor}=="0403", ATTRS{idProduct}=="6010", ATTRS{manufacturer}=="Digilent", MODE="0666"
+
+# Future Technology Devices International, Ltd FT232 Serial (UART) IC
+ACTION=="add|change", SUBSYSTEM=="usb|tty", ATTRS{idVendor}=="0403", ATTRS{idProduct}=="6001", MODE="0666"
+```
+
+You then need to reload the udev rules:
+
+```console
+# udevadm control --reload
+```
diff --git a/doc/ug/quickstart.md b/doc/ug/quickstart.md
new file mode 100644
index 0000000..decc522
--- /dev/null
+++ b/doc/ug/quickstart.md
@@ -0,0 +1,39 @@
+# Quickstart
+
+The environment variable `$REPO_TOP` is the top-level of the git source tree.
+
+## Simulation with Verilator
+
+_Make sure you followed the install instructions to [prepare the system](install_instructions.md#system-preparation) and to install the [software development tools](install_instructions.md#software-development) and [Verilator](install_instructions.md#verilator)._
+
+Build the simulator and the software and then run the simulation
+
+```console
+$ cd $REPO_TOP
+$ fusesoc --cores-root . sim --build-only lowrisc:systems:top_earlgrey_verilator
+$ make SIM=1 -C sw/boot_rom clean all
+$ make SIM=1 -C sw/tests/hello_world clean all
+$ build/lowrisc_systems_top_earlgrey_verilator_0.1/sim-verilator/Vtop_earlgrey_verilator --rominit=sw/boot_rom/boot_rom.vmem \
+    --flashinit=sw/tests/hello_world/hello_world.vmem
+```
+
+See the [getting started](getting_started_verilator.md) for a complete guide.
+
+## Running on an FPGA
+
+This description assumes the usage of the Nexys Video board.
+
+_Make sure you followed the install instructions to [prepare the system](install_instructions.md#system-preparation) and to install the [software development tools](install_instructions.md#software-development) and [Xilinx Vivado](install_instructions.md#xilinx-vivado)._
+
+Build the software and the bitstream and then program the board
+
+```console
+$ cd $REPO_TOP
+$ make -C sw/boot_rom clean all
+$ make -C sw/tests/hello_world clean all
+$ . /tools/xilinx/Vivado/2018.3/settings64.sh
+$ fusesoc --cores-root . build lowrisc:systems:top_earlgrey_nexysvideo
+$ fusesoc --cores-root . pgm lowrisc:systems:top_earlgrey_nexysvideo:0.1
+```
+
+See the [getting started](getting_started_fpga.md) for a complete guide.
diff --git a/doc/ug/system_list.md b/doc/ug/system_list.md
new file mode 100644
index 0000000..5f24127
--- /dev/null
+++ b/doc/ug/system_list.md
@@ -0,0 +1,40 @@
+# List of Top-Level Designs
+
+This page lists all top-level designs and their targets that are contained within this repository.
+Click on the design name to get more information about the design.
+
+<table>
+ <thead>
+ <tr>
+ <th>Design</th>
+ <th>Internal Name</th>
+ <th>Simulation Targets</th>
+ <th>FPGA Targets</th>
+ <th>ASIC Targets</th>
+ <th>Description</th>
+ </tr>
+ </thead>
+ <tr>
+ <td><a href="/hw/top_earlgrey/doc/top_earlgrey.html">Earl Grey</a></td>
+ <td><pre>top_earlgrey</pre></td>
+ <td>
+ <ul>
+ <li>Verilator</li>
+ <li>?</li>
+ </ul>
+ </td>
+ <td>
+ <ul>
+ <li>Nexys Video*</li>
+ </ul>
+ </td>
+ <td>
+ <i>None yet.</i>
+ </td>
+ <td>
+ 0.1 release
+ </td>
+ </tr>
+</table>
+
+A `*` after an FPGA board name indicates that it can be used with a free EDA tool license.
diff --git a/doc/ug/vendor_hw.md b/doc/ug/vendor_hw.md
new file mode 100644
index 0000000..f4a3aa7
--- /dev/null
+++ b/doc/ug/vendor_hw.md
@@ -0,0 +1,395 @@
+# Work with hardware code in external repositories
+
+OpenTitan is not a closed ecosystem: we incorporate code from third parties, and we split out pieces of our code to reach a wider audience.
+In both cases, we need to import and use code from external repositories in our OpenTitan code base.
+Read on for step-by-step instructions for common tasks, and for background information on the topic.
+
+## Summary
+
+Code in subdirectories of `hw/vendor` is imported (copied in) from external repositories (which may be provided by lowRISC or other sources).
+The external repository is called "upstream".
+Any development on code imported into `hw/vendor` should happen upstream whenever possible.
+Files ending with `.vendor.hjson` indicate where the upstream repository is located.
+
+In particular, this means:
+
+- If you find a bug in imported code or want to enhance it, report it upstream.
+- Follow the rules and style guides of the upstream project.
+ They might differ from our own rules.
+- Use the upstream mechanisms to make code changes. In many cases, upstream uses GitHub Pull Requests just like we do.
+- Work with upstream reviewers to get your changes merged into their code base.
+- Once the change is part of the upstream repository, the `vendor_hw` tool can be used to copy the upstream code back into our OpenTitan repository.
+
+Read on for the longer version of these guidelines.
+
+Pushing changes upstream first isn't always possible or desirable: upstream might not accept changes, or be slow to respond.
+In some cases, code changes are needed which are irrelevant for upstream and need to be maintained by us.
+Our vendoring infrastructure can handle such cases; read on for more information on how to do this.
+
+## Background
+
+OpenTitan is developed in a "monorepo", a single repository containing all its source code.
+This approach is beneficial for many reasons, ranging from an easier workflow to better reproducibility of the results, and that's why large companies like [Google](https://ai.google/research/pubs/pub45424) and Facebook are using monorepos.
+Monorepos are even more compelling for hardware development, which cannot make use of a standardized language-specific package manager like npm or pip.
+
+At the same time, open source is all about sharing and a free flow of code between projects.
+We want to take in code from others, but also to give back and grow a wider ecosystem around our output.
+To be able to do that, code repositories should be sufficiently modular and self-contained.
+For example, if a CPU core is buried deep in a repository containing a full SoC design, people will have a hard time using this CPU core for their designs and contributing to it.
+
+Our approach to this challenge: develop reusable parts of our code base in an external repository, and copy the source code back into our monorepo in an automated way.
+The process of copying in external code is commonly called "vendoring".
+
+Vendoring code is a good thing.
+We continue to maintain a single code base which is easy to fork, tag and generally work with, as all the normal Git tooling works.
+By explicitly importing code we also ensure that no unreviewed code sneaks into our code base, and that an "always buildable" configuration is maintained.
+
+But what happens if the imported code needs to be modified?
+Ideally, all code changes are submitted upstream, integrated into the upstream code base, and then re-imported into our code base.
+This development methodology is called "upstream first".
+History has shown repeatedly that an upstream first policy can help significantly with the long-term maintenance of code.
+
+However, strictly following an upstream first policy isn't great either.
+Some changes might not be useful for the upstream community, others might not be acceptable upstream, or might only be applied after a long delay.
+In these situations it must be possible to modify the code downstream, i.e. in our repository, as well.
+Our setup includes multiple options to achieve this goal.
+In many cases, applying patches on top of the imported code is the most sustainable option.
+
+To ease the pain points of vendoring code we have developed tooling, and we continue to improve it.
+Please open an issue ticket if you see areas where the tooling could be improved.
+
+## Basic concepts
+
+This section gives a quick overview of how we include code from other repositories into our repository.
+
+All imported ("vendored") hardware code is by convention put into the `hw/vendor` directory.
+(We have more conventions for file and directory names which are discussed below when the import of new code is described.)
+To interact with code in this directory a tool called `vendor_hw` is used, which can be found in `util/vendor_hw.py`.
+A "vendor description file" controls the vendoring process and serves as input to the `vendor_hw` tool.
+
+In the simple, yet typical, case, the vendor description file is only a couple of lines of human-readable Hjson:
+
+```command
+$ cat hw/vendor/lowrisc_ibex.vendor.hjson
+{
+ name: "lowrisc_ibex",
+ target_dir: "lowrisc_ibex",
+
+ upstream: {
+ url: "https://github.com/lowRISC/ibex.git",
+ rev: "master",
+ },
+}
+```
+
+This description file essentially says:
+We vendor a component called "lowrisc_ibex" and place the code into the "lowrisc_ibex" directory (relative to the description file).
+The code comes from the master branch of the Git repository found at https://github.com/lowRISC/ibex.git.
+
+With this description file written, the `vendor_hw` tool can do its job.
+
+```command
+$ cd $REPO_TOP
+$ ./util/vendor_hw.py hw/vendor/lowrisc_ibex.vendor.hjson --verbose
+INFO: Cloning upstream repository https://github.com/lowRISC/ibex.git @ master
+INFO: Cloned at revision 7728b7b6f2318fb4078945570a55af31ee77537a
+INFO: Copying upstream sources to /home/philipp/src/opentitan/hw/vendor/lowrisc_ibex
+INFO: Changes since the last import:
+* Typo fix in muldiv: Reminder->Remainder (Stefan Wallentowitz)
+INFO: Wrote lock file /home/philipp/src/opentitan/hw/vendor/lowrisc_ibex.lock.hjson
+INFO: Import finished
+```
+
+Looking at the output, you might wonder: how did the `vendor_hw` tool know what changed since the last import?
+It knows because it records the commit hash of the last import in a file called the "lock file".
+This file is stored next to the `.vendor.hjson` file and is named `.lock.hjson`.
+
+In the example above, it looks roughly like this:
+
+```command
+$ cat hw/vendor/lowrisc_ibex.lock.hjson
+{
+ upstream:
+ {
+ url: https://github.com/lowRISC/ibex.git
+ rev: 7728b7b6f2318fb4078945570a55af31ee77537a
+ }
+}
+```
+
+The lock file should be committed together with the code itself to make the import step reproducible at any time.
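
Because the lock file is plain Hjson, the pinned revision can also be read back with standard text tools, e.g. to manually check out the exact imported upstream state. A minimal sketch (the simple `rev:` layout shown above is assumed; a real Hjson parser would be more robust):

```shell
# Sketch: read the pinned revision from a vendor lock file.
# A sample file is created here for illustration; in the repository the
# real file lives at hw/vendor/lowrisc_ibex.lock.hjson.
lock=$(mktemp)
cat > "$lock" <<'EOF'
{
  upstream:
  {
    url: https://github.com/lowRISC/ibex.git
    rev: 7728b7b6f2318fb4078945570a55af31ee77537a
  }
}
EOF
rev=$(sed -n 's/^[[:space:]]*rev:[[:space:]]*//p' "$lock")
echo "$rev"
```

With the revision in hand, `git checkout <rev>` in a clone of the upstream repository reproduces the imported state.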
+
+After running `vendor_hw`, the code in your local working copy is updated to the latest upstream version.
+Next is testing: run simulations, syntheses, or other tests to ensure that the new code works as expected.
+Once you're confident that the new code is good to be committed, do so using the normal Git commands.
+
+```command
+$ cd $REPO_TOP
+
+$ # Stage all files in the vendored directory
+$ git add -A hw/vendor/lowrisc_ibex
+
+$ # Stage the lock file as well
+$ git add hw/vendor/lowrisc_ibex.lock.hjson
+
+$ # Now commit everything. Don't forget to write a useful commit message!
+$ git commit
+```
+
+Instead of running `vendor_hw` first, and then manually creating a Git commit, you can also use the `--commit` flag.
+
+```command
+$ cd $REPO_TOP
+$ ./util/vendor_hw.py hw/vendor/lowrisc_ibex.vendor.hjson --verbose --commit
+```
+
+This command updates the "lowrisc_ibex" code, and creates a Git commit from it.
+
+Read on for a complete example of how to efficiently update a vendored dependency, and how to make changes to such code.
+
+## Update vendored code in our repository
+
+A complete example of how to update a vendored dependency, commit the changes, and create a pull request from them is given below.
+
+```command
+$ cd $REPO_TOP
+$ # Ensure a clean working directory
+$ git stash
+$ # Create a new branch for the pull request
+$ git checkout -b update-ibex-code upstream/master
+$ # Update lowrisc_ibex and create a commit
+$ ./util/vendor_hw.py hw/vendor/lowrisc_ibex.vendor.hjson --verbose --commit
+$ # Push the new branch to your fork
+$ git push origin update-ibex-code
+$ # Restore changes in working directory (if anything was stashed before)
+$ git stash pop
+```
+
+Now go to the GitHub web interface to open a Pull Request for the `update-ibex-code` branch.
+
+## How to modify vendored code (fix a bug, improve it)
+
+### Step 1: Get the vendored repository
+
+1. Open the vendor description file (`.vendor.hjson`) of the dependency you want to update and take note of the `url` and the `rev` (typically a branch name) in the `upstream` section.
+
+2. Clone the upstream repository and switch to the used branch:
+
+ ```command
+ $ # Go to your source directory (can be anywhere)
+ $ cd ~/src
+ $ # Clone the repository and switch the branch. Below is an example for ibex.
+ $ git clone https://github.com/lowRISC/ibex.git
+ $ cd ibex
+ $ git checkout master
+ ```
+
+After this step you're ready to make your modifications.
+You can do so *either* directly in the upstream repository, *or* start in the OpenTitan repository.
+
+### Step 2a: Make modifications in the upstream repository
+
+The easiest option is to modify the upstream repository directly as usual.
+
+### Step 2b: Make modifications in the OpenTitan repository
+
+Most changes to external code are motivated by our own needs.
+Modifying the external code directly in the `hw/vendor` directory is therefore a sensible starting point.
+
+1. Make your changes in the OpenTitan repository. Do not commit them.
+
+2. Create a patch with your changes. The example below uses `lowrisc_ibex`.
+
+ ```command
+ $ cd hw/vendor/lowrisc_ibex
+ $ git diff --relative . > changes.patch
+ ```
+
+3. Take note of the revision of the imported repository from the lock file.
+ ```command
+ $ cat hw/vendor/lowrisc_ibex.lock.hjson | grep rev
+ rev: 7728b7b6f2318fb4078945570a55af31ee77537a
+ ```
+
+4. Switch to the checked out upstream repository and bring it into the same state as the imported repository.
+ Again, the example below uses ibex, adjust as needed.
+
+ ```command
+ # Change to the upstream repository
+ $ cd ~/src/ibex
+
+ $ # Create a new branch for your patch
+ $ # Use the revision you determined in the previous step!
+ $ git checkout -b modify-ibex-somehow 7728b7b6f2318fb4078945570a55af31ee77537a
+   $ git apply -p1 < $REPO_TOP/hw/vendor/lowrisc_ibex/changes.patch
+
+ $ # Add and commit your changes as usual
+ $ # You can create multiple commits with git add -p and committing
+ $ # multiple times.
+ $ git add -u
+ $ git commit
+ ```
+
+### Step 3: Get your changes accepted upstream
+
+You have now created a commit in the upstream repository.
+Before submitting your changes upstream, rebase them on top of the upstream development branch, typically `master`, and ensure that all tests pass.
+Now you need to follow the upstream guidelines on how to get the change accepted.
+In many cases their workflow is similar to ours: push your changes to a repository fork on your namespace, create a pull request, work through review comments, and update it until the change is accepted and merged.
+
+### Step 4: Update the vendored copy of the external dependency
+
+After your change is accepted upstream, you can update our copy of the code using the `vendor_hw` tool as described before.
+
+## How to vendor new code
+
+Vendoring external code is done by creating a vendor description file, and then running the `vendor_hw` tool.
+
+1. Create a vendor description file for the new dependency.
+ 1. Make note of the Git repository and the branch you want to vendor in.
+ 2. Choose a name for the external dependency.
+ It is recommended to use the format `<vendor>_<name>`.
+ Typically `<vendor>` is the lower-cased user or organization name on GitHub, and `<name>` is the lower-cased project name.
+ 3. Choose a target directory.
+      It is recommended to use the dependency name as the directory name.
+ 4. Create the vendor description file in `hw/vendor/<vendor>_<name>.vendor.hjson` with the following contents (adjust as needed):
+
+ ```
+ // Copyright lowRISC contributors.
+ // Licensed under the Apache License, Version 2.0, see LICENSE for details.
+ // SPDX-License-Identifier: Apache-2.0
+ {
+ name: "lowrisc_ibex",
+ target_dir: "lowrisc_ibex",
+
+ upstream: {
+ url: "https://github.com/lowRISC/ibex.git",
+ rev: "master",
+ },
+ }
+ ```
+
+2. Create a new branch for a subsequent pull request
+
+ ```command
+ $ git checkout -b vendor-something upstream/master
+ ```
+
+3. Commit the vendor description file
+
+ ```command
+ $ git add hw/vendor/<vendor>_<name>.vendor.hjson
+ $ git commit
+ ```
+
+4. Run the `vendor_hw` tool for the newly vendored code.
+
+ ```command
+ $ cd $REPO_TOP
+ $ ./util/vendor_hw.py hw/vendor/lowrisc_ibex.vendor.hjson --verbose --commit
+ ```
+
+5. Push the branch to your fork for review (assuming `origin` is the remote name of your fork).
+
+ ```command
+ $ git push -u origin vendor-something
+ ```
+
+   Now go to the GitHub web interface to create a Pull Request for the newly created branch.
+
+## How to exclude some files from the upstream repository
+
+You can exclude files from the upstream code base by listing them in the vendor description file under `exclude_from_upstream`.
+Glob-style wildcards are supported (`*`, `?`, etc.), as known from shells.
+
+Example:
+
+```
+// section of a .vendor.hjson file
+exclude_from_upstream: [
+ // exclude all *.h files in the src directory
+ "src/*.h*",
+ // exclude the src_files.yml file
+ "src_files.yml",
+ // exclude some_directory and all files below it
+ "some_directory",
+]
+```
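
The wildcards follow shell-style globbing, so a pattern can be sanity-checked with a shell `case` statement. This is only an approximation: whether the tool lets `*` cross directory separators is an assumption here, so treat it as a rough check.

```shell
# Rough check of which paths a glob like "src/*.h*" would exclude.
would_exclude() {
  case "$1" in
    src/*.h*) echo "excluded: $1" ;;
    *)        echo "kept: $1" ;;
  esac
}
would_exclude src/ibex_core.h
would_exclude src/main.c
```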
+
+## How to add patches on top of the imported code
+
+In some cases the upstream code must be modified before it can be used.
+For this purpose, the `vendor_hw` tool can apply patches on top of imported code.
+The patches are kept as separate files in our repository, making it easy to understand the differences from the upstream code, and to switch the imported code to a newer upstream version.
+
+To apply patches on top of vendored code, do the following:
+
+1. Extend the `.vendor.hjson` file of the dependency and add a `patch_dir` line pointing to a directory of patch files.
+ It is recommended to place patches into the `patches/<vendor>_<name>` directory.
+
+ ```
+ patch_dir: "patches/lowrisc_ibex",
+ ```
+
+2. Place patch files with a `.patch` suffix in the `patch_dir`.
+
+3. When running `vendor_hw`, patches are applied on top of the imported code according to the following rules.
+
+   - Patches are applied in alphabetical order of their filenames.
+     Name patches like `0001-do-something.patch` to ensure a deterministic order.
+ - Patches are applied relative to the base directory of the imported code.
+ - The first directory component of the filename in a patch is stripped, i.e. they are applied with the `-p1` argument of `patch`.
+ - Patches are applied with `git apply`, making all extended features of Git patches available (e.g. renames).
+
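Taken together, these rules roughly correspond to the following manual procedure. This is an illustrative sketch only: the tiny demo repository, file names, and the loop are assumptions, and the real logic lives in `util/vendor_hw.py`.

```shell
# Sketch: apply a patch series the way vendor_hw does: in filename
# order, with the leading path component stripped (-p1), via git apply.
set -e
d=$(mktemp -d)
cd "$d"
git init -q .
printf 'hello\n' > readme.txt
git add readme.txt
git -c user.email=you@example.com -c user.name=demo commit -qm init
# A minimal patch, as vendor_hw would find it in patches/<vendor>_<name>/
mkdir -p patches
cat > patches/0001-fix-readme.patch <<'EOF'
--- a/readme.txt
+++ b/readme.txt
@@ -1 +1 @@
-hello
+hello world
EOF
# Shell glob expansion sorts filenames, matching the ordering rule above.
for p in patches/*.patch; do
  git apply -p1 "$p"
done
cat readme.txt
```
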
+## How to manage patches in a Git repository
+
+Managing a patch series on top of an evolving code base can be challenging:
+as the underlying code changes, the patches must be refreshed to continue to apply, and adding or reordering patches is a very manual process.
+
+Fortunately, Git can be used to simplify this task.
+The idea:
+
+- Create a forked Git repository of the upstream code
+- Create a new branch in this fork.
+- Commit all your changes on top of the upstream code into this branch.
+- Convert all commits into patch files and store them where the `vendor_hw` tool can find and apply them.
+
+The last step is automated by the `vendor_hw` tool through its `--refresh-patches` argument.
+
+1. Modify the vendor description file to add a `patch_repo` section.
+ - The `url` parameter specifies the URL to the fork of the upstream repository containing all modifications.
+ - The `rev_base` is the base revision, typically the `master` branch.
+ - The `rev_patched` is the patched revision, typically the name of the branch with your changes.
+
+ ```
+ patch_repo: {
+ url: "https://github.com/lowRISC/riscv-dbg.git",
+ rev_base: "master",
+ rev_patched: "changes",
+ },
+ ```
+
+2. Commit your changes and push them to the forked repository.
+ Make sure to push both branches to the fork: `rev_base` **and** `rev_patched`.
+ In the example above, this would be (with `REMOTE_NAME_FORK` being the remote name of the fork):
+
+ ```command
+ git push REMOTE_NAME_FORK master changes
+ ```
+
+3. Run the `vendor_hw` tool with the `--refresh-patches` argument.
+ It will first check out the patch repository and convert all commits which are in the `rev_patched` branch and not in the `rev_base` branch into patch files.
+ These patch files are then stored in the patch directory.
+ After that, the vendoring process continues as usual: all patches are applied and if instructed by the `--commit` flag, a commit is created.
+ This commit now also includes the updated patch files.
+
+To update the patches you can use all the usual Git tools in the forked repository.
+
+- Use `git rebase` to refresh them on top of changes in the upstream repository.
+- Add new patches as commits on the `rev_patched` branch.
+- Remove patches or reorder them with Git interactive rebase (`git rebase -i`).
+
+It is important to update and push *both* branches in the forked repository: the `rev_base` branch and the `rev_patched` branch.
+Use `git log rev_base..rev_patched` (replace `rev_base` and `rev_patched` as needed) to show all commits which will be turned into patches.
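
Under the hood, turning the `rev_base..rev_patched` range into patch files is assumed to behave like `git format-patch` over the same range. A self-contained sketch (the branch names and the tiny demo repository are illustrative only):

```shell
# Sketch: convert commits on a patched branch into numbered patch files,
# as --refresh-patches is assumed to do via "git format-patch".
set -e
d=$(mktemp -d)
cd "$d"
git init -q .
base=$(git symbolic-ref --short HEAD)   # stand-in for rev_base
git -c user.email=you@example.com -c user.name=demo commit -q --allow-empty -m base
git checkout -qb changes                # stand-in for rev_patched
printf 'fix\n' > fix.txt
git add fix.txt
git -c user.email=you@example.com -c user.name=demo commit -qm 'Add fix'
# One .patch file per commit in rev_base..rev_patched:
git format-patch -o patches "$base"..changes
```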