
Commit

Merge pull request #619 from volatilityfoundation/release/v2.0.0
Release/v2.0.0
ikelos authored Jan 12, 2022
2 parents 8ecc7df + d469d9c commit 13cb292
Showing 228 changed files with 11,269 additions and 4,514 deletions.
15 changes: 15 additions & 0 deletions API_CHANGES.md
@@ -0,0 +1,15 @@
API Changes
===========

When an addition to the existing API is made, the minor version is bumped.
When an API feature or function is removed or changed, the major version is bumped.
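As an illustration only (not part of the project), the bump policy above can be sketched as a small helper:

```python
def bump(version: str, change: str) -> str:
    """Apply the stated policy: additions bump the minor version,
    removals or changes bump the major version."""
    major, minor, patch = (int(p) for p in version.split("."))
    if change == "addition":
        return f"{major}.{minor + 1}.0"
    if change in ("removal", "change"):
        return f"{major + 1}.0.0"
    return version  # no API-visible change

# e.g. adding module collections to a 1.1.x API yields 1.2.0
print(bump("1.1.0", "addition"))  # 1.2.0
```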


1.2.0
=====
* Added support for module collections
* Added context.modules
* Added ModuleRequirement
* Added get\_symbols\_by\_absolute\_location


37 changes: 25 additions & 12 deletions README.md
@@ -1,6 +1,6 @@
# Volatility 3: The volatile memory extraction framework

Volatility is the worlds most widely used framework for extracting digital
Volatility is the world's most widely used framework for extracting digital
artifacts from volatile memory (RAM) samples. The extraction techniques are
performed completely independent of the system being investigated but offer
visibility into the runtime state of the system. The framework is intended
@@ -14,21 +14,34 @@ technical and performance challenges associated with the original
code base that became apparent over the previous 10 years. Another benefit
of the rewrite is that Volatility 3 could be released under a custom
license that was more aligned with the goals of the Volatility community,
the Volatility Software License (VSL). See the [LICENSE](LICENSE.txt) file for more details.
the Volatility Software License (VSL). See the
[LICENSE](https://www.volatilityfoundation.org/license/vsl-v1.0) file for
more details.

## Requirements

- Python 3.5.3 or later. <https://www.python.org>
- Pefile 2017.8.1 or later. <https://pypi.org/project/pefile/>
Volatility 3 requires Python 3.6.0 or later. To install the most minimal set of dependencies (some plugins will not work) use a command such as:

## Optional Dependencies
```shell
pip3 install -r requirements-minimal.txt
```

Alternately, the minimal packages will be installed automatically when Volatility 3 is installed using setup.py. However, as noted in the Quick Start section below, Volatility 3 does not *need* to be installed via setup.py prior to using it.

- yara-python 3.8.0 or later. <https://github.com/VirusTotal/yara-python>
- capstone 3.0.0 or later. <https://www.capstone-engine.org/download.html>
```shell
python3 setup.py build
python3 setup.py install
```

To enable the full range of Volatility 3 functionality, use a command like the one below. For partial functionality, comment out any unnecessary packages in [requirements.txt](requirements.txt) prior to running the command.

```shell
pip3 install -r requirements.txt
```

## Downloading Volatility

The latest stable version of Volatility will always be the master branch of the GitHub repository. You can get the latest version of the code using the following command:
The latest stable version of Volatility will always be the stable branch of the GitHub repository. You can get the latest version of the code using the following command:

```shell
git clone https://github.com/volatilityfoundation/volatility3.git
@@ -45,7 +58,7 @@ git clone https://github.com/volatilityfoundation/volatility3.git
2. See available options:

```shell
python3 vol.py h
python3 vol.py -h
```

3. To get more information on a Windows memory sample and to make sure
@@ -55,10 +68,10 @@ Volatility supports that sample type, run
Example:

```shell
python3 vol.py f /home/user/samples/stuxnet.vmem windows.info
python3 vol.py -f /home/user/samples/stuxnet.vmem windows.info
```

4. Run some other plugins. The `-f` or `-single-location` is not strictly
4. Run some other plugins. The `-f` or `--single-location` is not strictly
required, but most plugins expect a single sample. Some also
require/accept other options. Run `python3 vol.py <plugin> -h`
for more information on a particular command.
@@ -91,7 +104,7 @@ The latest generated copy of the documentation can be found at: <https://volatil

## Licensing and Copyright

Copyright (C) 2007-2020 Volatility Foundation
Copyright (C) 2007-2022 Volatility Foundation

All Rights Reserved

77 changes: 77 additions & 0 deletions development/banner_server.py
@@ -0,0 +1,77 @@
import argparse
import base64
import json
import logging
import os
import pathlib
import urllib

from volatility3.cli import PrintedProgress
from volatility3.framework import contexts, constants
from volatility3.framework.automagic import linux, mac

vollog = logging.getLogger(__name__)


class BannerCacheGenerator:

def __init__(self, path: str, url_prefix: str):
self._path = path
self._url_prefix = url_prefix

def convert_url(self, url):
parsed = urllib.parse.urlparse(url)

relpath = os.path.relpath(parsed.path, os.path.abspath(self._path))

return urllib.parse.urljoin(self._url_prefix, relpath)

def run(self):
context = contexts.Context()
json_output = {'version': 1}

path = self._path
filename = '*'

for banner_cache in [linux.LinuxBannerCache, mac.MacBannerCache]:
sub_path = banner_cache.os
potentials = []
for extension in constants.ISF_EXTENSIONS:
# Hopefully these will not be large lists, otherwise this might be slow
try:
for found in pathlib.Path(path).joinpath(sub_path).resolve().rglob(filename + extension):
potentials.append(found.as_uri())
except FileNotFoundError:
# If there's no linux symbols, don't cry about it
pass

new_banners = banner_cache.read_new_banners(context, 'BannerServer', potentials, banner_cache.symbol_name,
banner_cache.os, progress_callback = PrintedProgress())
result_banners = {}
for new_banner in new_banners:
# Only accept file schemes
value = [self.convert_url(url) for url in new_banners[new_banner] if
urllib.parse.urlparse(url).scheme == 'file']
if value and new_banner:
# Convert files into URLs
result_banners[str(base64.b64encode(new_banner), 'latin-1')] = value

json_output[banner_cache.os] = result_banners

output_path = os.path.join(self._path, 'banners.json')
with open(output_path, 'w') as fp:
vollog.warning(f"Banners file written to {output_path}")
json.dump(json_output, fp)


if __name__ == '__main__':

parser = argparse.ArgumentParser()
parser.add_argument('--path', default = os.path.dirname(__file__))
parser.add_argument('--urlprefix', help = 'Web prefix that will eventually serve the ISF files',
default = 'http://localhost/symbols')

args = parser.parse_args()

bcg = BannerCacheGenerator(args.path, args.urlprefix)
bcg.run()
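`convert_url` above rewrites local symbol-file paths as URLs under the configured web prefix. The same transformation, reduced to a standalone sketch (the paths and prefix are made up for illustration; note the trailing slash on the prefix, which `urljoin` needs to append rather than replace the last path segment):

```python
import os
import urllib.parse

def convert_url(url: str, base_path: str, url_prefix: str) -> str:
    # Same approach as BannerCacheGenerator.convert_url: take the file's
    # path relative to the cache directory and graft it onto the prefix.
    parsed = urllib.parse.urlparse(url)
    relpath = os.path.relpath(parsed.path, os.path.abspath(base_path))
    return urllib.parse.urljoin(url_prefix, relpath)

print(convert_url('file:///srv/symbols/linux/ubuntu.json.xz',
                  '/srv/symbols',
                  'http://localhost/symbols/'))
# → http://localhost/symbols/linux/ubuntu.json.xz (on POSIX)
```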
16 changes: 8 additions & 8 deletions development/compare-vol.py
Original file line number Diff line number Diff line change
@@ -46,7 +46,7 @@ def create_results(self, plugin: VolatilityPlugin, image: VolatilityImage, image
self.create_prerequisites(plugin, image, image_hash)

# Volatility 2 Test
print("[*] Testing {} {} with image {}".format(self.short_name, plugin.name, image.filepath))
print(f"[*] Testing {self.short_name} {plugin.name} with image {image.filepath}")
os.chdir(self.path)
cmd = self.plugin_cmd(plugin, image)
start_time = time.perf_counter()
@@ -56,15 +56,15 @@ def create_results(self, plugin: VolatilityPlugin, image: VolatilityImage, image
completed = excp
end_time = time.perf_counter()
total_time = end_time - start_time
print(" Tested {} {} with image {}: {}".format(self.short_name, plugin.name, image.filepath, total_time))
print(f" Tested {self.short_name} {plugin.name} with image {image.filepath}: {total_time}")
with open(
os.path.join(self.output_directory, '{}_{}_{}_stdout'.format(self.short_name, plugin.name, image_hash)),
os.path.join(self.output_directory, f'{self.short_name}_{plugin.name}_{image_hash}_stdout'),
"wb") as f:
f.write(completed.stdout)
if completed.stderr:
with open(
os.path.join(self.output_directory, '{}_{}_{}_stderr'.format(self.short_name, plugin.name,
image_hash)), "wb") as f:
os.path.join(self.output_directory, f'{self.short_name}_{plugin.name}_{image_hash}_stderr'),
"wb") as f:
f.write(completed.stderr)
return [total_time]
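The edits in this file migrate `str.format` calls to f-strings (available since Python 3.6, which the README now sets as the minimum). For simple substitutions the two forms produce identical output:

```python
short_name, plugin, path = "vol2", "pslist", "/tmp/img.vmem"  # illustrative values

old_style = "[*] Testing {} {} with image {}".format(short_name, plugin, path)
new_style = f"[*] Testing {short_name} {plugin} with image {path}"

assert old_style == new_style  # same string, clearer source
```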

@@ -91,15 +91,15 @@ def create_results(self, plugin: VolatilityPlugin, image: VolatilityImage, image
def create_prerequisites(self, plugin: VolatilityPlugin, image: VolatilityImage, image_hash):
# Volatility 2 image info
if not image.vol2_profile:
print("[*] Testing {} imageinfo with image {}".format(self.short_name, image.filepath))
print(f"[*] Testing {self.short_name} imageinfo with image {image.filepath}")
os.chdir(self.path)
cmd = ["python2", "-u", "vol.py", "-f", image.filepath, "imageinfo"]
start_time = time.perf_counter()
vol2_completed = subprocess.run(cmd, cwd = self.path, capture_output = True)
end_time = time.perf_counter()
image.vol2_imageinfo_time = end_time - start_time
print(" Tested volatility2 imageinfo with image {}: {}".format(image.filepath, end_time - start_time))
with open(os.path.join(self.output_directory, 'vol2_imageinfo_{}_stdout'.format(image_hash)), "wb") as f:
print(f" Tested volatility2 imageinfo with image {image.filepath}: {end_time - start_time}")
with open(os.path.join(self.output_directory, f'vol2_imageinfo_{image_hash}_stdout'), "wb") as f:
f.write(vol2_completed.stdout)
image.vol2_profile = re.search(b"Suggested Profile\(s\) : ([^,]+)", vol2_completed.stdout)[1]
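The `re.search` on the last line pulls the first suggested profile out of imageinfo's output. A sketch against fabricated sample output (a raw bytes pattern, `rb"..."`, avoids the invalid-escape warning that `b"...\(s\)..."` triggers on newer Pythons):

```python
import re

# Fabricated imageinfo output for illustration
stdout = b"          Suggested Profile(s) : Win7SP1x64, Win7SP0x64\n"

# [^,]+ stops at the first comma, keeping only the first suggestion
profile = re.search(rb"Suggested Profile\(s\) : ([^,]+)", stdout)[1]
print(profile)  # b'Win7SP1x64'
```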

7 changes: 4 additions & 3 deletions development/mac-kdk/extract_kernel.sh
@@ -44,7 +44,8 @@ ${DWARF2JSON}
popd
rm -fr tmp

${DWARF2JSON} mac --macho "${UNPACK_DIR}/${KERNEL_DIR}/kernel.dSYM" --macho-symbols "${UNPACK_DIRECTORY}/${KERNEL_DIR}/kernel" | xz -9 > ${JSON_DIR}/${KERNEL_DIR}.json.xz
if [ $? == 0 ]; then
${DWARF2JSON} mac --arch i386 --macho "${UNPACK_DIR}/${KERNEL_DIR}/kernel.dSYM" --macho-symbols "${UNPACK_DIRECTORY}/${KERNEL_DIR}/kernel" | xz -9 > ${JSON_DIR}/${KERNEL_DIR}.json.xz
echo "Running ${DWARF2JSON} mac --macho "${UNPACK_DIR}/${KERNEL_DIR}/kernel.dSYM" --macho-symbols "${UNPACK_DIR}/${KERNEL_DIR}/kernel" | xz -9 > ${JSON_DIR}/${KERNEL_DIR}.json.xz"
${DWARF2JSON} mac --macho "${UNPACK_DIR}/${KERNEL_DIR}/kernel.dSYM" --macho-symbols "${UNPACK_DIR}/${KERNEL_DIR}/kernel" | xz -9 > ${JSON_DIR}/${KERNEL_DIR}.json.xz
if [ $? != 0 ]; then
${DWARF2JSON} mac --arch i386 --macho "${UNPACK_DIR}/${KERNEL_DIR}/kernel.dSYM" --macho-symbols "${UNPACK_DIR}/${KERNEL_DIR}/kernel" | xz -9 > ${JSON_DIR}/${KERNEL_DIR}.json.xz
fi
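The fix above reverses the condition so the `--arch i386` invocation runs only when the first dwarf2json attempt fails. The control flow, reduced to a runnable sketch with stand-in commands in place of the real dwarf2json calls:

```shell
#!/bin/sh
# Stand-ins for the two dwarf2json invocations (assumptions, not the real tool)
primary()  { return 1; }            # simulate the default attempt failing
fallback() { echo "fallback ran"; } # stands in for the --arch i386 retry

primary
if [ $? != 0 ]; then
    fallback
fi
```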
18 changes: 9 additions & 9 deletions development/pdbparse-to-json.py
@@ -27,17 +27,17 @@ def retreive_pdb(self, guid: str, file_name: str) -> Optional[str]:
logger.info("Download PDB file...")
file_name = ".".join(file_name.split(".")[:-1] + ['pdb'])
for sym_url in ['http://msdl.microsoft.com/download/symbols']:
url = sym_url + "/{}/{}/".format(file_name, guid)
url = sym_url + f"/{file_name}/{guid}/"

result = None
for suffix in [file_name[:-1] + '_', file_name]:
try:
logger.debug("Attempting to retrieve {}".format(url + suffix))
logger.debug(f"Attempting to retrieve {url + suffix}")
result, _ = request.urlretrieve(url + suffix)
except request.HTTPError as excp:
logger.debug("Failed with {}".format(excp))
logger.debug(f"Failed with {excp}")
if result:
logger.debug("Successfully written to {}".format(result))
logger.debug(f"Successfully written to {result}")
break
return result
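`retreive_pdb` builds symbol-server URLs and tries the compressed variant first: by symbol-server convention, CAB-compressed files are stored with the last character of the extension replaced by an underscore (`ntkrnlmp.pd_`). The URL construction in isolation (the GUID string is made up for illustration):

```python
file_name = "ntkrnlmp.pdb"
guid = "3844DBB920174967BE7AA4A2C20430FA2"  # illustrative GUID+age string

url = f"http://msdl.microsoft.com/download/symbols/{file_name}/{guid}/"

# Candidates in the order the script tries them: compressed first,
# then the uncompressed PDB
candidates = [url + file_name[:-1] + "_", url + file_name]
print(candidates[0])
```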

@@ -116,7 +116,7 @@ def __init__(self, filename: str):
self._filename = filename
logger.info("Parsing PDB...")
self._pdb = pdbparse.parse(filename)
self._seen_ctypes = set([]) # type: Set[str]
self._seen_ctypes: Set[str] = set([])

def lookup_ctype(self, ctype: str) -> str:
self._seen_ctypes.add(ctype)
@@ -169,7 +169,7 @@ def generate_metadata(self) -> Dict[str, Any]:
def read_enums(self) -> Dict:
"""Reads the Enumerations from the PDB file"""
logger.info("Reading enums...")
output = {} # type: Dict[str, Any]
output: Dict[str, Any] = {}
stream = self._pdb.STREAM_TPI
for type_index in stream.types:
user_type = stream.types[type_index]
@@ -231,7 +231,7 @@ def read_usertypes(self) -> Dict:

def _format_usertype(self, usertype, kind) -> Dict:
"""Produces a single usertype"""
fields = {} # type: Dict[str, Dict[str, Any]]
fields: Dict[str, Dict[str, Any]] = {}
[fields.update(self._format_field(s)) for s in usertype.fieldlist.substructs]
return {usertype.name: {'fields': fields, 'kind': kind, 'size': usertype.size}}

@@ -257,7 +257,7 @@ def _determine_size(self, field):
if output is None:
import pdb
pdb.set_trace()
raise ValueError("Unknown size for field: {}".format(field.name))
raise ValueError(f"Unknown size for field: {field.name}")
return output

def _format_kind(self, kind):
@@ -355,6 +355,6 @@ def read_basetypes(self) -> Dict:
json.dump(convertor.read_pdb(), f, indent = 2, sort_keys = True)

if args.keep:
print("Temporary PDB file: {}".format(filename))
print(f"Temporary PDB file: {filename}")
elif delfile:
os.remove(filename)
10 changes: 5 additions & 5 deletions development/schema_validate.py
@@ -35,7 +35,7 @@
for filename in args.filenames:
try:
if os.path.exists(filename):
print("[?] Validating file: {}".format(filename))
print(f"[?] Validating file: {filename}")
with open(filename, 'r') as t:
test = json.load(t)

@@ -45,14 +45,14 @@
result = schemas.validate(test, False)

if result:
print("[+] Validation successful: {}".format(filename))
print(f"[+] Validation successful: {filename}")
else:
print("[-] Validation failed: {}".format(filename))
print(f"[-] Validation failed: {filename}")
failures.append(filename)
else:
print("[x] File not found: {}".format(filename))
print(f"[x] File not found: {filename}")
except Exception as e:
failures.append(filename)
print("[x] Exception occurred: {} ({})".format(filename, repr(e)))
print(f"[x] Exception occurred: {filename} ({repr(e)})")

print("Failures", failures)
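For context, the validation here delegates to the framework's bundled JSON schemas. A minimal stand-in check of the same shape (the top-level key names below are assumptions based on the ISF layout, not the real schema, which is far stricter):

```python
import json

# Assumed top-level sections of an ISF symbol file (illustrative only)
REQUIRED_KEYS = {"metadata", "base_types", "user_types", "enums", "symbols"}

def looks_like_isf(text: str) -> bool:
    # Cheap structural check, much weaker than real schema validation
    try:
        doc = json.loads(text)
    except json.JSONDecodeError:
        return False
    return isinstance(doc, dict) and REQUIRED_KEYS.issubset(doc)

print(looks_like_isf('{"metadata": {}, "base_types": {}, "user_types": {}, '
                     '"enums": {}, "symbols": {}}'))  # True
```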
16 changes: 8 additions & 8 deletions development/stock-linux-json.py
@@ -30,7 +30,7 @@ def download_lists(self, keep = False):
def download_list(self, urls: List[str]) -> Dict[str, str]:
processed_files = {}
for url in urls:
print(" - Downloading {}".format(url))
print(f" - Downloading {url}")
data = requests.get(url)
with tempfile.NamedTemporaryFile() as archivedata:
archivedata.write(data.content)
@@ -48,14 +48,14 @@ def process_rpm(self, archivedata) -> Optional[str]:
extracted = None
for member in rpm.getmembers():
if 'vmlinux' in member.name or 'System.map' in member.name:
print(" - Extracting {}".format(member.name))
print(f" - Extracting {member.name}")
extracted = rpm.extractfile(member)
break
if not member or not extracted:
return None
with tempfile.NamedTemporaryFile(delete = False,
prefix = 'vmlinux' if 'vmlinux' in member.name else 'System.map') as output:
print(" - Writing to {}".format(output.name))
print(f" - Writing to {output.name}")
output.write(extracted.read())
return output.name

@@ -65,14 +65,14 @@ def process_deb(self, archivedata) -> Optional[str]:
extracted = None
for member in deb.data.tgz().getmembers():
if member.name.endswith('vmlinux') or 'System.map' in member.name:
print(" - Extracting {}".format(member.name))
print(f" - Extracting {member.name}")
extracted = deb.data.get_file(member.name)
break
if not member or not extracted:
return None
with tempfile.NamedTemporaryFile(delete = False,
prefix = 'vmlinux' if 'vmlinux' in member.name else 'System.map') as output:
print(" - Writing to {}".format(output.name))
print(f" - Writing to {output.name}")
output.write(extracted.read())
return output.name

@@ -81,7 +81,7 @@ def process_files(self, named_files: Dict[str, str]):
print("Processing Files...")
for i in named_files:
if named_files[i] is None:
print("FAILURE: None encountered for {}".format(i))
print(f"FAILURE: None encountered for {i}")
return
args = [DWARF2JSON, 'linux']
output_filename = 'unknown-kernel.json'
@@ -91,10 +91,10 @@ def process_files(self, named_files: Dict[str, str]):
prefix = '--elf'
output_filename = './' + '-'.join((named_file.split('/')[-1]).split('-')[2:])[:-4] + '.json.xz'
args += [prefix, named_files[named_file]]
print(" - Running {}".format(args))
print(f" - Running {args}")
proc = subprocess.run(args, capture_output = True)

print(" - Writing to {}".format(output_filename))
print(f" - Writing to {output_filename}")
with lzma.open(output_filename, 'w') as f:
f.write(proc.stdout)
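The tail of `process_files` pipes the captured dwarf2json stdout through `lzma`, producing the `.json.xz` files Volatility consumes. The write/read round-trip in isolation (the payload is a stand-in for real tool output):

```python
import lzma
import os
import tempfile

data = b'{"symbols": {}}'  # stand-in for dwarf2json stdout

path = os.path.join(tempfile.mkdtemp(), "kernel.json.xz")
with lzma.open(path, "w") as f:  # same call the script uses for output
    f.write(data)

with lzma.open(path) as f:       # decompresses transparently on read
    assert f.read() == data      # payload round-trips intact
```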

4 changes: 4 additions & 0 deletions doc/requirements.txt
@@ -0,0 +1,4 @@
# These packages are required for building the documentation.
sphinx>=1.8.2
sphinx_autodoc_typehints>=1.4.0
sphinx-rtd-theme>=0.4.3