User Documentation
Recaf is a modern Java and Android reverse engineering tool. This entails supporting many tasks such as providing an intuitive UI for navigating application logic, simplifying the process of modifying application behavior, and much more. You can read about all of the other features Recaf provides in this documentation.
Developer Documentation
Recaf is a modern Java and Android reverse engineering tool. This entails supporting many tasks such as providing an intuitive UI for navigating application logic, simplifying the process of modifying application behavior, and much more. You can read about the internals of Recaf, how to develop on top of them with plugins and scripts, and more by reading the documentation.
Getting Started
For plugin development
Most of what you need should be covered by:
- CDI
- This page talks about how dependency injection is used in Recaf to access its features.
- Services
- This page links to the lists of different scoped services / features.
- Plugins
- This page talks about how to write plugins.
For contributing to Recaf directly
Contributing can cover a number of different tasks.
If you want to see what bugs need to be fixed and what features we want to add, visit the Recaf issue tracker on GitHub.
If you want to contribute documentation, visit the Recaf site project on GitHub.
Architecture
These articles talk about the overall design of Recaf.
Modules
Core
This portion of Recaf's source contains all of the back-end logic. This includes workspace modeling, services, and utilities.
UI
This portion of Recaf's source is a JavaFX application that enables interaction with features from the `core` and `api` modules.
Important libraries
A brief overview of the major dependencies Recaf uses in each module.
Core
JVM Bytecode Manipulation: Recaf uses ASM and CafeDude to parse bytecode. Most operations are based on ASM since it heavily abstracts away the class file format, making what would otherwise be tedious work simple. CafeDude is used for lower-level operations and for patching classes that are not compliant with ASM.
Android to Java Conversion: Recaf uses dex-translator to map the Dalvik bytecode of classes into JVM bytecode. This process is a bit lossy, but allows the use of JVM tooling (like the different decompilers) on Android content.
Android Dalvik Bytecode Manipulation: We are currently investigating how to handle Dalvik manipulation.
ZIP Files: Recaf uses LL-Java-Zip to read ZIP files. The behavior of LL-Java-Zip is configurable and can mirror how different tools interpret archives. This is important for Java reverse engineering since the JVM itself has some odd parsing quirks that most other libraries do not mirror. More information about this can be read on the LL-Java-Zip project page.
Source Parsing: Recaf uses OpenRewrite to parse Java source code. The major reasons for choosing this over other more mainstream parsing libraries are that:
- The AST model is error resilient. This is important since code Recaf is decompiling may not always yield perfectly correct Java code, especially with more intense forms of obfuscation. The ability to ignore invalid sections of the source while maintaining the ability to interact with recognizable portions is very valuable.
- The AST model bakes in the type, when known, to all AST nodes. For a node such as a method reference, you can easily access the name of the reference, the method descriptor of the reference, and the owning class defining the method. This information is what all of our context-sensitive actions must have access to in order to function properly.
- The AST supports easy source transformation options. In the past if a user wanted to remap a class or member, we would apply the mapping, decompile the mapped class, then replace the text contents with the new decompilation. This process can be slower on larger classes due to longer decompilation times. If we can skip that step and instead instantly transform the AST to update the text we can save a lot of time in these cases.
- The AST supports code formatting. We can allow the user to apply post-processing to decompiled code to give it a uniform style of their choosing, or allow them to format code to that style on demand with a keybind.
CDI: Recaf uses Weld as its CDI implementation. You can read the CDI article for more information.
UI
JavaFX: Recaf uses JavaFX as its UI framework. Its observable property model makes it easy to keep UI components updated when the backing data changes. Additionally, it is styled via CSS, which makes customizing the UI for Recaf-specific operations much simpler than something like Swing.
AtlantaFX: Recaf uses AtlantaFX to handle common theming.
Ikonli: Recaf uses Ikonli for scalable icons, specifically the Carbon pack.
Docking: Recaf uses Tiwul-FX's docking framework for handling dockable tabs across all open windows.
CDI
Recaf as an application is a CDI container. This facilitates dependency injection throughout the application.
Context before jumping into CDI
If you are unfamiliar with dependency injection (DI) and DI frameworks, watch this video. It covers example cases where using DI makes sense, and how DI frameworks are used. While the series the video belongs to is for Dagger, the ideas apply globally to all DI frameworks.
What is CDI though?
CDI is Contexts and Dependency Injection for Java EE. If that sounds confusing, here's what it actually means in practice. When a class implements one of Recaf's service interfaces, we need a way to access that implementation so that the feature can be used. CDI uses annotations to determine when to allocate new instances of these implementations. The main three used in Recaf are the following:
- `@ApplicationScoped`: This implementation is lazily allocated once and used for the entire duration of the application.
- `@WorkspaceScoped`: This implementation is lazily allocated once, but the value is thrown out when a new `Workspace` is loaded. This way, when the implementation is requested, an instance linked to the current workspace is always given.
- `@Dependent`: This implementation is not cached, so a new instance is provided every time upon request. You can think of it as being "scopeless".
When creating a class in Recaf, you can supply these implementations via a constructor that takes parameters for all of the needed types and is annotated with `@Inject`. This means you will not be invoking the constructor yourself; you will let CDI allocate it for you. Your new class can then also be used the same way, via `@Inject`-annotated constructors.
What does CDI look like in Recaf?
Let's assume a simple case. We'll create an interface outlining some behavior, like compiling some code. We will create a single implementation class and mark it as `@ApplicationScoped`, since it is not associated with any specific state, like the current Recaf workspace.
interface Compiler {
byte[] build(String src);
}
@ApplicationScoped
class CompilerImpl implements Compiler {
@Override
public byte[] build(String src) { ... }
}
Then in our UI we can create a class that injects the base `Compiler` type. We do not need to know any implementation details. Because we have only one implementation, the CDI container knows to grab an instance of `CompilerImpl` and pass it along to our constructor annotated with `@Inject`.
@Dependent
class CompilerGui {
TextArea textEditor = ...
// There is only one implementation of 'Compiler' which is 'CompilerImpl'
@Inject CompilerGui(Compiler compiler) { this.compiler = compiler; }
// called when user wants to save (CTRL + S)
void onSaveRequest() {
byte[] code = compiler.build(textEditor.getText());
}
}
In this example, can I inject `Compiler` into multiple places?
Yes. Because the implementation `CompilerImpl` is `ApplicationScoped`, the same instance will be used everywhere you inject it. Recall that `ApplicationScoped` essentially means the class is a singleton.
What happens if there are multiple implementations of `Compiler`?
If you use `@Inject CompilerGui(Compiler compiler)` with more than one available `Compiler` implementation, the injection will throw an exception. You need to qualify which one you want to use. While CDI comes with the ability to use annotations to differentiate between implementations, it is best to create a new sub-class/interface for each implementation and then use those in your `@Inject` constructor, as sketched below.
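For instance, continuing the `Compiler` example from earlier, a minimal sketch of this approach could look like the following (the `JavaCompiler` / `KotlinCompiler` sub-interfaces are made up for illustration):
// Each implementation gets its own dedicated sub-interface.
interface JavaCompiler extends Compiler {}
interface KotlinCompiler extends Compiler {}
@ApplicationScoped
class JavaCompilerImpl implements JavaCompiler { ... }
@ApplicationScoped
class KotlinCompilerImpl implements KotlinCompiler { ... }
@Dependent
class CompilerGui {
    // Injecting 'JavaCompiler' rather than 'Compiler' leaves no ambiguity.
    @Inject CompilerGui(JavaCompiler compiler) { ... }
}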
What if I want to inject or request a value later and not immediately in the constructor?
CDI comes with the type `Instance<T>` which serves this purpose. It implements `Supplier<T>`, which allows you to do `T value = instance.get()`.
@Dependent
class Foo {
// ...
}
@ApplicationScoped
class FooManager {
private final Instance<Foo> fooProvider;
@Inject
FooManager(Instance<Foo> fooProvider) {
// We do not request the creation of Foo yet.
this.fooProvider = fooProvider;
}
@Nonnull
Foo createFoo() {
// Now we request the creation of Foo.
// Since 'Foo' in this example is dependent, each returned value is a new instance.
return fooProvider.get();
}
}
What if I want multiple implementations? Can I get all of them at once?
Recaf has multiple decompiler implementations built in. Let's look at a simplified version of how that works. Instead of declaring a parameter of `Decompiler`, which takes one value, we use `Instance<Decompiler>`, which can be used both as a producer of a single value and as an `Iterable<T>`, allowing us to loop over all known implementations of the `Decompiler` interface.
@ApplicationScoped
class DecompileManager {
@Inject DecompileManager(Instance<Decompiler> implementations) {
for (Decompiler implementation : implementations)
registerDecompiler(implementation);
}
}
From here, we can define methods in `DecompileManager` to manage which decompiler we want to use. Then in the UI, we `@Inject` this `DecompileManager` and use it to interact with `Decompiler` instances rather than doing so directly.
Can I mix what scopes I inject into a constructor?
I'd just like to point out that what you can do and what you should do are not always a perfect match. As a general rule of thumb, what you inject as a parameter should be wider in scope than what the current class is defined as. Here's a table for reference.
I have a... | I want to inject a... | Should I do that? |
---|---|---|
ApplicationScoped class | ApplicationScoped parameter | :heavy_check_mark: Yes |
ApplicationScoped class | WorkspaceScoped parameter | :x: No |
ApplicationScoped class | Dependent parameter | :x: No |
WorkspaceScoped class | ApplicationScoped parameter | :heavy_check_mark: Yes |
WorkspaceScoped class | WorkspaceScoped parameter | :heavy_check_mark: Yes |
WorkspaceScoped class | Dependent parameter | :x: No |
Dependent class | ApplicationScoped parameter | :heavy_check_mark: Yes |
Dependent class | WorkspaceScoped parameter | :heavy_check_mark: Yes |
Dependent class | Dependent parameter | :heavy_check_mark: Yes |
This table is for directly injecting types. If you have a `Dependent` type you can use `Instance<Foo>` like in the example above.
What if I need a value dynamically, and getting values from the constructor isn't good enough?
Firstly, reconsider whether you're designing things effectively if this is a problem for you. Recall that you can use `Instance<T>` to essentially inject a producer of `T`. But on the off chance that there is no real alternative, in situations where providing values to constructors is not feasible, the `Recaf` class provides methods for accessing CDI managed types:
- `Instance<T> instance(Class<T>)`: Gives you a `Supplier<T>` / `Iterable<T>` for the requested type `T`. You can use `Supplier.get()` to grab a single instance of `T`, or use `Iterable.iterator()` to iterate over multiple instances of `T` if more than one implementation exists.
- `T get(Class<T>)`: Gives you a single instance of the requested type `T`.
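For example, a small sketch of pulling services out of a `Recaf` instance (using service types from the service lists later in this document):
// 'recaf' is the Recaf instance created at bootstrap (see "Launching Recaf" below).
Recaf recaf = ...;
// Grab a single instance of an application scoped service.
WorkspaceManager workspaceManager = recaf.get(WorkspaceManager.class);
// Or grab a provider, deferring creation until 'get()' is called.
Instance<ResourceImporter> importerProvider = recaf.instance(ResourceImporter.class);
ResourceImporter importer = importerProvider.get();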
How do I know which scope to use when making new services?
Services that are effectively singletons will be `@ApplicationScoped`.
Services that depend on the current content of a workspace will be `@WorkspaceScoped`.
- In some cases, you may want to design a service as `@ApplicationScoped` and just pass in the `Workspace` as a method parameter. For instance, implementing a search: it needs `Workspace` access for sure, but the behavior is constant, so it makes more sense to implement it as an `@ApplicationScoped` type.
- A strong case for `@WorkspaceScoped` is services that directly correlate with the contents of a `Workspace`. For instance, the inheritance graph service: the data it models will only ever be relevant to an active workspace, and having to pass in a `Workspace` every time would make implementing caching difficult.
Components acting only as views and wrappers of other components can mirror their dependencies' scope, or use `@Dependent`, since it's not the view that really matters but the data backing it.
Launching Recaf
When Recaf is launched, the `Bootstrap` class is used to initialize an instance of `Recaf`. The `Bootstrap` class creates a CDI container that is configured to automatically discover implementations of the services outlined in the `api` module. Once this process is completed, the newly made CDI container is wrapped in a `Recaf` instance which lasts for the duration of the application.
Why are so many UI classes `@Dependent` scoped?
There are multiple reasons.
1. On principle, they should not model/track data by themselves
Interactive controls that the user sees should never track data by themselves. If a control cannot be tossed in the garbage without adverse side effects, it is poorly designed. These controls provide visual access to the data within the Recaf instance (like workspace contents), nothing more.
This was briefly mentioned above when discussing "how do I know which scope to use?".
2. CDI cannot create proxies of classes with `final` methods, which UI classes often define
UI classes like JavaFX's `Menu` often have methods marked as `final` to prevent subclasses from overriding certain behavior. The `Menu` class's `List<T> getItems()` is one of these methods. This prevents any `Menu` type from being given a scope like `@ApplicationScoped`, since our CDI implementation relies heavily on proxies for scope management. When a class is `@Dependent` it effectively has no scope, so there is no need to wrap it in a proxy.
When are components created?
CDI instantiates components when they are first used. If you declare an `@ApplicationScoped` component, but it is never used anywhere, it will never be initialized.
If you want or need something to be initialized immediately when Recaf launches, add the extra annotation `@EagerInitialization`. Any component that has this will be initialized at the moment defined by the `value()` in `EagerInitialization`. This annotation can be used in conjunction with `@ApplicationScoped` or `@WorkspaceScoped`; a sketch follows the list below.
There are two options:
- `IMMEDIATE`: The component is initialized as soon as possible.
  - For `@ApplicationScoped` this occurs right after the CDI container is created.
  - For `@WorkspaceScoped` this occurs right after a workspace is opened.
- `AFTER_UI_INIT`: The component is initialized after the UI platform is initialized.
  - For `@ApplicationScoped` this occurs right after the UI platform is initialized, as expected.
  - For `@WorkspaceScoped` this occurs right after a workspace is opened, with the assumption that the UI is already initialized.
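As a sketch, an eagerly initialized application scoped component might look like the following (the name of the enum holding `IMMEDIATE` / `AFTER_UI_INIT` is an assumption here):
@EagerInitialization(InitializationStage.IMMEDIATE) // assumed name of the 'value()' enum
@ApplicationScoped
class StartupHook {
    @Inject
    StartupHook() {
        // Runs right after the CDI container is created,
        // even if no other component ever injects 'StartupHook'.
    }
}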
Be aware that any component annotated with this annotation forces all of its dependency components to also be initialized eagerly.
The workspace
Legend
- The primary resource is the input that is targeted for editing. If you drag and drop a single JAR file into Recaf, then this will represent that JAR file. The representation is broken down into pieces:
  - The JVM class bundle contains all the `.class` files in the input that are not treated specially by the JVM.
  - JAR files allow you to have multiple versions of the same class for different JVM versions via "Multi-release JAR". This is a map of JVM version numbers to bundles of classes associated with that specific version.
  - Android's APK files may contain multiple containers of classes known as DEX files. This is a mapping of each DEX file to the classes contained within it.
  - The file bundle contains all other regular files that are not ZIP archives.
  - ZIP archives are represented as embedded resources, allowing a ZIP in a ZIP, or a JAR in a JAR, to be navigable within Recaf.
- Workspaces can have multiple inputs. These additional inputs can be used to enhance the performance of some services such as inheritance graphing, recompilation, and SSVM virtualization, just to name a few. These supporting resources are not intended to be editable and are just there to "support" services as described before.
- Recaf adds a few of its own supporting resources, but manages them separately from the supporting resources list.
  - The runtime resource allows Recaf to access classes in the current JVM process, like `java.lang.String`.
  - The Android resource allows Recaf to access classes in the Android runtime. It is automatically loaded when a resource with DEX files is detected.
Creating workspaces
To create a `Workspace` instance you will almost always be using the `BasicWorkspace` implementation. You can pass along either:
- A single `WorkspaceResource` for the primary resource.
- A single `WorkspaceResource` for the primary resource, plus a `Collection<WorkspaceResource>` for the supporting resources.
To create a `WorkspaceResource` you can use the `ResourceImporter` service, which allows you to read content from a variety of inputs.
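Putting the two together, creating a workspace from a file on disk might look like this sketch (the same pattern appears in the scripting example later in this document):
// Load a resource from a path, then wrap it in a basic workspace.
WorkspaceResource resource = importer.importResource(Paths.get("C:/Samples/Test.jar"));
Workspace workspace = new BasicWorkspace(resource);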
Loading workspaces
There are multiple ways to load workspaces internally. Depending on your intent, you'll want to do it differently.
For loading from `Path` values in a UI context, use `PathLoadingManager`. It will handle loading the content from the path in a background thread, and gives you a simple consumer type to handle IO problems.
Otherwise, you can use `WorkspaceManager` directly to call `setCurrent(Workspace)`.
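For the direct route, continuing the sketch above:
// Assign the workspace as the active one so services and the UI pick it up.
workspaceManager.setCurrent(workspace);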
Exporting workspaces
You can create an instance of `WorkspaceExportOptions` and configure it to suit your needs. The options allow you to change:
- The compression scheme of contents:
  - `MATCH_ORIGINAL`, which will only compress items if they were originally compressed when read.
  - `SMART`, which will only compress items if compression yields a smaller output than a non-compressed item. Very small files may become larger with compression due to the overhead of the compression scheme's dictionary.
  - `ALWAYS`, which always compresses items.
  - `NEVER`, which never compresses items.
- The output type, being a file or directory.
- The path to write to.
- The option to bundle contents of supporting resources into the exported output.
- The option to create ZIP file directory entries, if the output type is `FILE`. This creates empty entries in the output ZIP/JAR detailing directory paths. Some tools may use this data, but it's not required in most circumstances.
The configured options instance can be re-used to export contents with the same configuration multiple times. To export a workspace, call `options.create()` to create a `WorkspaceExporter`, which then allows you to pass a `Workspace` instance.
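A short sketch, assuming the exporter's method for consuming a `Workspace` is named `export`:
WorkspaceExporter exporter = options.create();
exporter.export(workspace); // Writes the workspace using the configured options (assumed method name)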
Listeners
The `WorkspaceManager` allows you to register listeners for multiple workspace events:
- `WorkspaceOpenListener`: When a new workspace is opened within the manager.
- `WorkspaceCloseListener`: When a prior workspace is closed within the manager.
- `WorkspaceModificationListener`: When the active workspace's model is changed (supporting resource added/removed).
When creating services and CDI enabled classes, you can annotate the class with `@AutoRegisterWorkspaceListeners` to automatically register and unregister the class based on what is necessary for the CDI scope.
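Manual registration is a one-liner; for example, reacting to newly opened workspaces (the same pattern used in the plugin example later on):
workspaceManager.addWorkspaceOpenListener(newWorkspace -> {
    // React to the newly opened workspace here.
});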
Accessing classes/files in the workspace
Classes and files reside within the `WorkspaceResource` items in a `Workspace`. You can access the resources directly like so:
// Content the user intends to edit
WorkspaceResource resource = workspace.getPrimaryResource();
// Content to support editing, but is not editable
List<WorkspaceResource> resources = workspace.getSupportingResources();
// All content in the workspace, which may include internally managed
// supporting resources if desired. Typically 'false'.
List<WorkspaceResource> allResources = workspace.getAllResources(includeInternal);
As described in the workspace model above, resources have multiple "bundles" that contain content. The groups exist to facilitate modeling a variety of potential input types that Recaf supports. Bundles that represent classes share a common type `ClassBundle`, which is broken down further into `JvmClassBundle` and `AndroidClassBundle` where relevant. Bundles that represent files are only ever `FileBundle`.
// Contains JVM classes
JvmClassBundle bundle = resource.getJvmClassBundle();
// Contains JVM classes, grouped by the version of Java targeted by each class
NavigableMap<Integer, VersionedJvmClassBundle> bundles = resource.getVersionedJvmClassBundles();
// Contains Android classes, grouped by the name of each DEX file
Map<String, AndroidClassBundle> bundles = resource.getAndroidClassBundles();
// Contains files
FileBundle bundle = resource.getFileBundle();
// Contains files that represent archives, with a model of the archive contents
Map<String, WorkspaceFileResource> embeddedResources = resource.getEmbeddedResources();
These bundles are `Map<String, T>` and `Iterable<T>` where `T` is the content type.
JvmClassBundle classBundle = resource.getJvmClassBundle();
FileBundle fileBundle = resource.getFileBundle();
// Get JVM class by name (remember to null check)
JvmClassInfo exampleClass = classBundle.get("com/example/Example");
// Looping over bundles
for (JvmClassInfo classInfo : classBundle)
...
for (FileInfo fileInfo : fileBundle)
...
// There are also stream operations to easily iterate over multiple bundles at once.
resource.classBundleStream()
.flatMap(Bundle::stream)
.forEach(classInfo -> {
// All classes in all bundles that hold 'ClassInfo' values
// including JvmClassBundle and AndroidClassBundle instances
});
Finding specific classes/files in the workspace
The `Workspace` interface defines some `find` operations allowing for simple name look-ups of classes and files.
Method | Usage |
---|---|
ClassPathNode findClass(String internalName) | Finds the first available ClassInfo by the given name, and wraps it in a ClassPathNode . |
ClassPathNode findJvmClass(String internalName) | Finds the first available JvmClassInfo by the given name, and wraps it in a ClassPathNode . |
ClassPathNode findLatestVersionedJvmClass(String internalName) | Finds the most up-to-date JvmClassInfo from all available versioned bundles, wrapping it in a ClassPathNode . |
ClassPathNode findVersionedJvmClass(String internalName, int version) | Finds the first available JvmClassInfo matching the given version (floored to the next available older version), and wraps it in a ClassPathNode . |
ClassPathNode findAndroidClass(String internalName) | Finds the first available AndroidClassInfo by the given name, and wraps it in a ClassPathNode . |
DirectoryPathNode findPackage(String name) | Finds the first available ClassInfo defined in the given package, or any sub-package, then wraps the path in a DirectoryPathNode . |
SortedSet<ClassPathNode> findClasses(Predicate<ClassInfo> filter) | Collects all ClassInfo values in the workspace that match the given predicate, and wraps each in a ClassPathNode . The returned set ordering for paths is alphabetic order. |
SortedSet<ClassPathNode> findJvmClasses(Predicate<JvmClassInfo> filter) | Collects all JvmClassInfo values in the workspace that match the given predicate, and wraps each in a ClassPathNode . The returned set ordering for paths is alphabetic order. |
SortedSet<ClassPathNode> findVersionedJvmClasses(Predicate<JvmClassInfo> filter) | Collects all versioned JvmClassInfo values in the workspace that match the given predicate, and wraps each in a ClassPathNode . The returned set ordering for paths is alphabetic order. |
SortedSet<ClassPathNode> findAndroidClasses(Predicate<AndroidClassInfo> filter) | Collects all AndroidClassInfo values in the workspace that match the given predicate, and wraps each in a ClassPathNode . The returned set ordering for paths is alphabetic order. |
FilePathNode findFile(String filePath) | Finds any available FileInfo by the given name, and wraps it in a FilePathNode . |
SortedSet<FilePathNode> findFiles(Predicate<FileInfo> filter) | Collects all FileInfo values in the workspace that match the given predicate, and wraps each in a FilePathNode . The returned set ordering for paths is alphabetic order. |
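For example, a simple name look-up followed by the null check these methods require:
// Find a class anywhere in the workspace by its internal name.
ClassPathNode classPath = workspace.findClass("com/example/Foo");
if (classPath != null) {
    ClassInfo info = classPath.getValue();
    // Work with 'info' here...
}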
Plugins & Scripts
- Plugins - Plugins are containers which can integrate with any of Recaf's services, can be loaded/unloaded, and are started when Recaf opens by default.
- Scripts - Scripts are small one-shot actions that can be as simple or complex as you want, so long as everything fits in a single Java source file. They're basically small plugins that are only run when the user requests them.
Plugins
What is a plugin?
A plugin is a JAR file that contains one or more classes, with exactly one of them implementing `software.coley.recaf.plugin.Plugin`. When Recaf launches, it looks in the plugins directory for JAR files that contain these plugin classes, then attempts to load and initialize them. Because a plugin is distributed as a JAR file, a plugin developer can create complex logic and organize it easily across multiple classes in the JAR.
You can find a template project for creating plugins on GitHub at Recaf-Plugins/Recaf-4x-plugin-workspace.
Using services
Plugins can use services by annotating the class with `@Dependent` and annotating the constructor with `@Inject`. For services that are application scoped, see this example:
import jakarta.enterprise.context.Dependent;
import jakarta.inject.Inject;
import software.coley.recaf.plugin.*;
import software.coley.recaf.services.workspace.WorkspaceManager;
// Dependent is a CDI annotation which loosely translates to being un-scoped.
// Plugin instances are managed by Recaf so the scope is bound to when plugins are loaded in practice.
@Dependent
@PluginInformation(id = "##ID##", version = "##VERSION##", name = "##NAME##", description = "##DESC##")
class MyPlugin implements Plugin {
private final WorkspaceManager workspaceManager;
// Example, injecting the 'WorkspaceManager' service which is '@ApplicationScoped'
@Inject
public MyPlugin(WorkspaceManager workspaceManager) {
this.workspaceManager = workspaceManager;
}
@Override
public void onEnable() { ... }
@Override
public void onDisable() { ... }
}
Services that are `@WorkspaceScoped` will not be available on plugin initialization, since no `Workspace` will be set/opened when this occurs. You can use CDI constructs like `Instance<T>` as lazy getters to work around this. Calling `get()` on the instance after a `Workspace` is opened will grab the workspace-scoped instance of the desired service. As an example, here is a basic plugin that injects the `InheritanceGraph` service:
import jakarta.enterprise.context.Dependent;
import jakarta.enterprise.inject.Instance;
import jakarta.inject.Inject;
import software.coley.recaf.plugin.*;
import software.coley.recaf.services.inheritance.InheritanceGraph;
import software.coley.recaf.services.workspace.WorkspaceManager;
@Dependent
@PluginInformation(id = "##ID##", version = "##VERSION##", name = "##NAME##", description = "##DESC##")
class MyPlugin implements Plugin {
// We will use the workspace manager to listen to when new workspaces are opened.
// When this occurs we can access instances of workspace scoped services.
@Inject
public MyPlugin(WorkspaceManager workspaceManager,
Instance<InheritanceGraph> graphProvider) {
// No workspace open, wait until one is opened by the user.
if (workspaceManager.getCurrent() == null) {
workspaceManager.addWorkspaceOpenListener(newWorkspace -> {
// This will be called AFTER the 'newWorkspace' value has been assigned
// as the 'current' workspace in the workspace manager.
// At this point, all workspace scoped services are re-allocated by CDI
// to target the newly opened workspace.
//
// Thus, we can get our inheritance graph of the workspace here.
InheritanceGraph graph = graphProvider.get();
});
} else {
// There is a workspace, so we can immediately get the graph for the current workspace.
InheritanceGraph graph = graphProvider.get();
}
}
@Override
public void onEnable() { ... }
@Override
public void onDisable() { ... }
}
For the list of available services, see the service lists.
CDI within Plugins
Plugins are capable of injecting Recaf's services in the plugin's constructor. Plugins themselves are only capable of being `@Dependent` scoped and cannot declare any injectable components themselves. For instance, if you want to create a new `JvmDecompiler` implementation that pulls in another Recaf service, you need to inject that service into the plugin's constructor and pass it to the `JvmDecompiler` implementation manually, as sketched below.
The reason for this is that once the CDI container is initialized at startup it cannot be modified. We can inject new classes with services already in the container, but nothing new can be added at runtime.
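A rough sketch of that pattern, where `MyDecompiler` is a hypothetical `JvmDecompiler` implementation and the registration method on `DecompileManager` is an assumption:
@Dependent
@PluginInformation(id = "##ID##", version = "##VERSION##", name = "##NAME##", description = "##DESC##")
class MyDecompilerPlugin implements Plugin {
    @Inject
    MyDecompilerPlugin(DecompileManager decompileManager, WorkspaceManager workspaceManager) {
        // The decompiler cannot use @Inject itself, so hand it the service manually.
        MyDecompiler decompiler = new MyDecompiler(workspaceManager);
        decompileManager.register(decompiler); // assumed registration method
    }
    @Override
    public void onEnable() { ... }
    @Override
    public void onDisable() { ... }
}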
Plugins and JavaFX
Plugins are loaded before JavaFX initializes. If your plugin has code that works with JavaFX classes or modifies Recaf's UI, then you need to wrap that code in a `FxThreadUtil.run(() -> { ... })` call.
You can still inject most UI services directly, like `MainMenuProvider`, but you have to be careful when calling their methods. Some services or injectable components are a bit more finicky and will require `Instance<ThingIWantToInject>` instead to work around this, where you call `instance.get()` inside the `FxThreadUtil.run(() -> { ... })` call to get the instance once JavaFX has been initialized.
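For instance, a sketch of deferring UI work until the JavaFX thread is available:
// Work queued here runs on the JavaFX thread once the UI platform is ready.
FxThreadUtil.run(() -> {
    ThingIWantToInject thing = instance.get();
    // ... interact with Recaf's UI here ...
});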
Scripts
What is a script?
A script is a single Java source file that is executed by users whenever they choose. Scripts can be written as a full class, supporting capabilities similar to plugins such as service injection, or in a short-hand form that offers automatic imports of most common utility classes but no access to services.
Full class script
A full class script is just a regular class that defines a non-static `void run()` method. The `run()` method is called whenever the user executes the script.
// ==Metadata==
// @name Hello world
// @description Prints hello world
// @version 1.0.0
// @author Author
// ==/Metadata==
class MyScript {
// must define 'void run()'
void run() {
System.out.println("hello");
}
}
You can access any of Recaf's services by declaring a constructor annotated with `@Inject`. More information on this is located further down the page.
Shorthand script
A shorthand script lets you write your logic without needing to declare a class and a `run()` method. These shorthand scripts are given a variable reference to the current workspace and a SLF4J logger: you can access the current workspace as `workspace` and the logger as `log`.
// ==Metadata==
// @name What is open?
// @description Prints what kinda workspace is open
// @version 1.0.0
// @author Author
// ==/Metadata==
String name = "(empty)";
if (workspace != null)
name = workspace.getClass().getSimpleName();
log.info("Workspace = {}", name);
Another example working with the provided `workspace`:
// Print out all enum names in the current workspace, if one is open.
if (workspace == null) return;
workspace.findClasses(Accessed::hasEnumModifier).stream()
.map(c -> c.getValue().getName())
.forEach(name -> log.info("Enum: {}", name));
Using services
Scripts are run when a user requests them, so you generally do not need to care about whether a service is `@ApplicationScoped` or `@WorkspaceScoped`. The assumption is that the user will run the script when it is needed, so a script that uses workspace-scoped content will only be run when a workspace is open. Of course, if the script is going to load a new workspace, then you will need to follow the same process as described for plugins when using workspace-scoped services.
Shorthand scripts cannot use services; scripts written in the full class form can, as in the following example:
// ==Metadata==
// @name Content loader
// @description Script to load content from a pre-set path.
// @version 1.0.0
// @author Col-E
// ==/Metadata==
import jakarta.enterprise.context.Dependent;
import jakarta.inject.Inject;
import org.slf4j.Logger;
import software.coley.recaf.analytics.logging.Logging;
import software.coley.recaf.info.JvmClassInfo;
import software.coley.recaf.services.workspace.WorkspaceManager;
import software.coley.recaf.services.workspace.io.ResourceImporter;
import software.coley.recaf.workspace.model.BasicWorkspace;
import software.coley.recaf.workspace.model.Workspace;
import software.coley.recaf.workspace.model.resource.WorkspaceResource;
import java.nio.file.Paths;
@Dependent
public class LoadContentScript {
private static final Logger logger = Logging.get("load-script");
private final ResourceImporter importer;
private final WorkspaceManager workspaceManager;
// We're injecting the importer to load 'WorkspaceResource' instances from paths on our system
// then we use the workspace manager to set the current workspace to the loaded content.
@Inject
public LoadContentScript(ResourceImporter importer, WorkspaceManager workspaceManager) {
this.importer = importer;
this.workspaceManager = workspaceManager;
}
// Scripts following the class model must define a 'void run()'
public void run() {
String path = "C:/Samples/Test.jar";
try {
// Load resource from path, wrap it in a basic workspace
WorkspaceResource resource = importer.importResource(Paths.get(path));
Workspace workspace = new BasicWorkspace(resource);
// Assign the workspace so the UI displays its content
workspaceManager.setCurrent(workspace);
} catch (Exception ex) {
logger.error("Failed to read content from '{}' - {}", path, ex.getMessage());
}
}
}
For the list of available services, see the service lists.
Services
As Recaf is driven by CDI, almost all of its features are defined as `@Inject`-able service classes.
- `@ApplicationScoped` services: Any feature injectable regardless of whether or not a `Workspace` is currently open
- `@WorkspaceScoped` services: Any feature injectable only when a `Workspace` is currently open
Application scoped services
These services are available for injection at any point.
API
These are the services defined in the `core` module.
- AttachManager
- CallGraphService
- CommentManager
- ConfigManager
- DecompileManager
- GsonProvider
- InfoImporter
- InheritanceGraphService
- JavacCompiler
- MappingApplierService
- MappingFormatManager
- MappingGenerator
- MappingListeners
- NameGeneratorProviders
- PatchApplier
- PatchProvider
- PhantomGenerator
- PluginManager
- ResourceImporter
- ScriptEngine
- ScriptManager
- SearchService
- SnippetManager
- TransformationApplierService
- TransformationManager
- WorkspaceManager
- WorkspaceProcessingService
UI
The `ui` module defines a number of new service types dedicated to UI behavior.
- Actions
- CellConfigurationService (wraps the following services)
  - ContextMenuProviderService
  - IconProviderService
  - TextProviderService
- ConfigComponentManager
- ConfigIconManager
- FileTypeSyntaxAssociationService
- NavigationManager
- PathExportingManager
- PathLoadingManager
- ResourceSummaryService
- WindowFactory
- WindowManager
- WindowStyling
AttachManager
The attach manager allows you to:
- Inspect what JVM processes are running on the current machine
  - Get the value of `System.getProperties()` of the remote JVM
  - Get the name of the remote JVM's starting main class (entry point)
  - Get JMX bean information from the remote JVM
  - Register listeners for when new JVM processes start
- Connect to these JVM processes and represent their content as a `WorkspaceRemoteVmResource`
Inspecting available JVMs
The Attach API lists available JVMs via `VirtualMachine.list()`. `AttachManager` builds on top of that, offering easier access to each available JVM's properties, starting main class, and JMX beans.
By default, Recaf does not scan the system for running JVMs unless the attach window is open. The refresh rate for scans is once per second. When a new JVM is found, Recaf queries it for the information listed above and caches the results. These operations typically involve a bit of tedious error handling and managing the connection state to the remote JVM, but now all you need is a single call to one of `AttachManager`'s getters.
Example: Manual scans & printing discovered JVMs
// Register listener to print the number of new/old VM's on the system
// - Each parameter is Set<VirtualMachineDescriptor>
attachManager.addPostScanListener((added, removed) -> {
logger.info("Update: {} new instances, {} instances were closed",
added.size(), removed.size());
});
// Scan for new JVMs every second
Executors.newScheduledThreadPool(1)
.scheduleAtFixedRate(attachManager::scan, 0, 1, TimeUnit.SECONDS);
Example: Get information of a remote JVM
// The 'descriptor' is a VirtualMachineDescriptor, which we can listen for new values of
// by looking at the example above.
int pid = attachManager.getVirtualMachinePid(descriptor);
Properties remoteProperties = attachManager.getVirtualMachineProperties(descriptor);
String mainClass = attachManager.getVirtualMachineMainClass(descriptor);
Example: Access JMX bean information of a remote JVM
// Recaf has a wrapper type for the JMX connection which grants one-liner access to common beans.
JmxBeanServerConnection jmxConnection = attachManager.getJmxServerConnection(descriptor);
// Available beans
MBeanInfo beanClassLoading = jmxConnection.getClassloadingBeanInfo();
MBeanInfo beanCompilation = jmxConnection.getCompilationBeanInfo();
MBeanInfo beanOperatingSystem = jmxConnection.getOperatingSystemBeanInfo();
MBeanInfo beanRuntime = jmxConnection.getRuntimeBeanInfo();
MBeanInfo beanThread = jmxConnection.getThreadBeanInfo();
// Iterating over bean contents
MBeanAttributeInfo[] attributes = beanRuntime.getAttributes();
try {
for (MBeanAttributeInfo attribute : attributes) {
String name = attribute.getName();
String description = attribute.getDescription();
Object value = beanRuntime.getAttributeValue(jmxConnection, attribute);
logger.info("{} : {} == {}", name, description, value);
}
} catch (Exception ex) {
logger.error("Failed to retrieve attribute values", ex);
}
Connecting to remote JVMs
To interact with a remote JVM's instrumentation capabilities, Recaf initializes a small TCP server in the remote JVM using Instrumentation Server. The `WorkspaceRemoteVmResource` type wraps the client instance that interacts with this server. To connect to the remote server you need to call `connect()` on the created `WorkspaceRemoteVmResource` value.
// Once we create a remote resource, we call 'connect' to activate the remote server in the process.
// For 'WorkspaceRemoteVmResource' usage, see the section under the workspace model category.
WorkspaceRemoteVmResource vmResource = attachManager.createRemoteResource(descriptor);
vmResource.connect();
// Can set the current workspace to load the remote VM in the UI.
workspaceManager.setCurrent(new BasicWorkspace(vmResource));
CallGraphService
The `CallGraphService` allows you to create `CallGraph` instances, which model the flow of method calls through the workspace.
Getting a CallGraph instance
The `CallGraph` type can be created for any arbitrary `Workspace`. By default the graph will not populate until you call `CallGraph#initialize`. This service will always keep a shared copy of the call graph for the current workspace.
// You can make a new graph from any workspace.
CallGraph graph = callGraphService.newCallGraph(workspace);
// Remember to call the "initialize" method.
graph.initialize();
// For large workspaces you may want to delegate this to run async and wait on the graph to be ready (see below).
CompletableFuture.runAsync(graph::initialize);
// Or get the current (shared) graph for the current workspace if one is open in the workspace manager.
// It will auto-initialize in the background. You will want to wait for the graph to be ready before use (see below).
graph = callGraphService.getCurrentWorkspaceGraph(); // Can be 'null' if no workspace is open.
Graph readiness
The call graph is populated in the background when a workspace is loaded and is not immediately ready for use. Before you attempt to use graph operations, check the value of the graph's `ObservableBoolean isReady()` method. You can register a listener on the `ObservableBoolean` to operate immediately once it is ready.
ObservableBoolean ready = graph.isReady();
// Use a listener to wait until the graph is ready for use
ready.addChangeListener((ob, old, current) -> {
if (current) {
// do things
}
});
// Or use a sleep loop, or any other blocking mechanism
while (!ready.getValue()) {
Thread.sleep(100);
}
Getting an entry point
The graph has its vertices bundled by which class defines each method. So to get your entry-point vertex in the graph, you need the `JvmClassInfo` reference of the class defining the method you want to look at.
// Given this example class (assume it resides in the package 'com/example')
class Foo {
public static void main(String[] args) { System.out.println("hello"); }
}
// Get the class reference
ClassPathNode clsPath = workspace.findJvmClass("com/example/Foo");
if (clsPath == null) return;
JvmClassInfo cls = clsPath.getValue().asJvmClass();
// Get the methods container for the class
ClassMethodsContainer containerMain = graph.getClassMethodsContainer(cls);
// Get the method in the container by name/descriptor
MethodVertex mainVertex = containerMain.getVertex("main", "([Ljava/lang/String;)V");
Navigating the graph
The `MethodVertex` has methods for:
- Giving current information about the method definition:
  - `MethodRef getMethod()` - Holds `String` values outlining the declared method and its defining class. Always present.
  - `MethodMember getResolvedMethod()` - Holds workspace references to the declared method; may be `null` if the declaring class type and/or method couldn't be found in the workspace.
- Getting methods that this definition calls to:
  - `Collection<MethodVertex> getCalls()`
- Getting methods that call this definition:
  - `Collection<MethodVertex> getCallers()`
Following the example from before, we should see that the only call from `main` is `PrintStream.println(String)`.
for (MethodVertex call : mainVertex.getCalls()) { // Only one value
MethodRef ref = call.getMethod();
String owner = ref.owner(); // Will be 'java/io/PrintStream'
String name = ref.name(); // Will be 'println'
String desc = ref.desc(); // Will be '(Ljava/lang/String;)V'
}
Similarly, if you looked up the vertex for `PrintStream.println(String)` and checked `getCallers()` you would find `Foo.main(String[])`.
CommentManager
The comment manager allows you to:
- Access all comments within a `Workspace` via `WorkspaceComments`
  - Look up comment information on a class, field, or method via `PathNode`
  - Iterate over commented classes as `ClassComments`
  - Add, update, or remove comments utilizing `ClassComments`
- Create listeners which are notified when:
  - New comment containers (per class) are created
  - Comments are updated
Comments are displayed in the UI by injecting them into the decompilation process. Before a class is sent to the decompiler, it is intercepted and unique annotations are inserted at commented locations. After the decompiler yields output, the unique annotations are replaced with the comments they are associated with. No modifications are made to the contents of the `Workspace` when adding/editing/removing comments.
Note: comments added to items in a workspace are stored externally from the workspace, in the Recaf directory.
Getting the desired WorkspaceComments
Recaf can store comments across multiple `Workspace` instances. So how do you get the comments from the one you want?
Assuming Recaf has a `Workspace` already opened and you want the comments of the current workspace:
// Will be 'null' if no workspace is open in Recaf
WorkspaceComments workspaceComments = commentManager.getCurrentWorkspaceComments();
If you have a `Workspace` instance you want to get the comments of:
Workspace workspace = ... // Get or create a workspace here
WorkspaceComments workspaceComments = commentManager.getOrCreateWorkspaceComments(workspace);
Note: Comments are stored in relation to a `Workspace` by a unique identifier associated with that `Workspace`. The identifier is pulled from the workspace's primary resource.
- `WorkspaceResource` instances loaded from file paths use the file path as the unique ID.
- `WorkspaceResource` instances loaded from directory paths use the directory path as the unique ID.
- `WorkspaceResource` instances loaded from a remote agent connection use the remote VM's identifier as the unique ID.
Getting an existing comment
For classes:
ClassPathNode classPath = workspace.findClass("com/example/Foo");
// Option 1: Lookup via generic PathNode
String comment = workspaceComments.getComment(classPath);
// Option 2: Lookup via ClassPathNode
String comment = workspaceComments.getClassComment(classPath);
// Option 3: Lookup the container of the class, then its comment
ClassComments classComments = workspaceComments.getClassComments(classPath);
if (classComments != null) {
String classComment = classComments.getClassComment();
}
For fields and methods:
ClassPathNode classPath = workspace.findClass("com/example/Foo");
ClassMemberPathNode memberPath = classPath.child("stringField", "Ljava/lang/String;");
// Option 1: Lookup via generic PathNode
String comment = workspaceComments.getComment(memberPath);
// Option 2: Lookup the container of the class, then its comment
ClassComments classComments = workspaceComments.getClassComments(classPath);
if (classComments != null) {
String comment2 = classComments.getFieldComment(memberPath.getValue());
// You can also pass strings for the name/descriptor
String comment3 = classComments.getFieldComment("stringField", "Ljava/lang/String;");
}
Getting all existing comments
A `ClassComments` instance by itself does not expose a reference to the `ClassPathNode` it is associated with. The instances created by the comment manager during runtime do keep a reference though, so an `instanceof DelegatingClassComments` check will allow you to get the associated `ClassPathNode` and, in turn, iterate over the class's fields and methods.
WorkspaceComments workspaceComments = commentManager.getCurrentWorkspaceComments();
for (ClassComments classComments : workspaceComments) {
String classComment = classComments.getClassComment();
// This subtype of class comment container records the associated class path.
if (classComments instanceof DelegatingClassComments delegatingClassComments) {
ClassPathNode classPath = delegatingClassComments.getPath();
// We can iterate over the fields/methods held by the path's class
ClassInfo classInfo = classPath.getValue();
for (FieldMember field : classInfo.getFields()) {
String fieldComment = classComments.getFieldComment(field);
}
for (MethodMember method : classInfo.getMethods()) {
String methodComment = classComments.getMethodComment(method);
}
}
}
Adding / editing / removing comments
Adding and modifying comments is done via `set` methods on a `ClassComments` instance. Comments can be removed by passing `null` as the comment parameter.
ClassPathNode classPath = workspace.findClass("com/example/Foo");
ClassComments classComments = workspaceComments.getOrCreateClassComments(classPath);
// Adding a comment to the class
classComments.setClassComment("class comment");
// Removing the comment
classComments.setClassComment(null);
// Adding a comment to a field in the class
classComments.setFieldComment("stringField", "Ljava/lang/String;", "Field comment");
// Adding a comment to a method in the class
classComments.setMethodComment("getStringField", "()Ljava/lang/String;", "Method comment");
Listening for new comments
If you want to track where comments are being made, you can register a `CommentUpdateListener`:
commentManager.addCommentListener(new CommentUpdateListener() {
// Only implementing the class comment method, the same idea can
// be applied to the field and method listener calls.
@Override
public void onClassCommentUpdated(@Nonnull ClassPathNode path, @Nullable String comment) {
if (comment == null)
logger.info("Comment removed on class '{}'", path.getValue().getName());
else
logger.info("Comment updated on class '{}'", path.getValue().getName());
}
});
ConfigManager
The config manager allows you to:
- Iterate over all `ConfigContainer` instances across Recaf
- Register and unregister your own `ConfigContainer` values
  - Useful for plugin developers who want to expose config values in the UI
- Register and unregister listeners which are notified when new `ConfigContainer` values are registered and unregistered
Iterating over registered containers
for (ConfigContainer container : configManager.getContainers())
logger.info("Container group={}, id={}", container.getGroup(), container.getId());
Registering and unregistering new containers
To add content to the config window, create a `ConfigContainer` instance with some `ConfigValue` values and register it in the config manager. The config window is configured to listen for new containers and add them to the UI.
// If you have your own class to represent config values,
// you will probably want to extend from 'BasicConfigContainer' and add values
// in the class's constructor via 'addValue(ConfigValue);'
// You can reference most existing config classes for examples of this setup.
ConfigContainer container = ...
configManager.registerContainer(container);
configManager.unregisterContainer(container);
When creating a `ConfigContainer` class, it is generally easiest to extend `BasicConfigContainer` and then use the `addValue(ConfigValue)` method.
public class MyThingConfig extends BasicConfigContainer {
private final ObservableString value = new ObservableString(null);
@Inject
public MyThingConfig() {
// Third party plugins should use 'EXTERNAL' as their group.
// This special group is treated differently in the config window UI,
// such that the ID's specified are text literals, and not translation keys.
super(ConfigGroups.EXTERNAL, "My thing config");
// Add values
addValue(new BasicConfigValue<>("My value", String.class, value));
}
public ObservableString getValue() {
return value;
}
}
Internal services within Recaf define their configs as `ApplicationScoped` so that they are discoverable by the manager when the manager is initialized. This allows all services to feed their configs into the system when the application launches.
Listening for new containers
configManager.addManagedConfigListener(new ManagedConfigListener() {
@Override
public void onRegister(@Nonnull ConfigContainer container) {
logger.info("Registered container: {} with {} values",
container.getGroupAndId(), container.getValues().size());
}
@Override
public void onUnregister(@Nonnull ConfigContainer container) {
logger.info("Unregistered container: {} with {} values",
container.getGroupAndId(), container.getValues().size());
}
});
DecompileManager
The decompile manager allows you to:
- Decompile a `JvmClassInfo` or `AndroidClassInfo` with:
  - The current preferred JVM or Android decompiler
  - A provided decompiler
- Register and unregister additional decompilers
- Pre-process classes before they are decompiled
- Post-process output from decompilers
Using the decompile calls in this manager will schedule the tasks in a shared thread-pool, whereas calling the decompile methods on `Decompiler` instances directly is a blocking operation. If you want to decompile many items, it is best to take advantage of the manager due to the pool usage. Additionally, decompiling via the manager facilitates caching of decompilation results and the globally specified filters.
Choosing a decompiler
There are a number of ways to grab a `Decompiler` instance. The operations are the same for the JVM and Android implementations.
// Currently configured target decompiler
JvmDecompiler decompiler = decompilerManager.getTargetJvmDecompiler();
// Specific decompiler by name
JvmDecompiler decompiler = decompilerManager.getJvmDecompiler("cfr");
// Any decompiler matching some condition, falling back to the target decompiler
JvmDecompiler decompiler = decompilerManager.getJvmDecompilers().stream()
.filter(d -> d.getName().equals("procyon"))
.findFirst().orElse(decompilerManager.getTargetJvmDecompiler());
Decompiling a class
If you want to pass a specific decompiler, get an instance and pass it to the decompile functions provided by `DecompileManager`:
- `decompile(Workspace, JvmClassInfo)` - Uses the target decompiler (specified in the config)
- `decompile(JvmDecompiler, Workspace, JvmClassInfo)` - Uses the specified decompiler passed to the method
JvmDecompiler decompiler = ...;
// Handle result when it's done.
decompilerManager.decompile(decompiler, workspace, classToDecompile)
.whenComplete((result, throwable) -> {
// Throwable thrown when unhandled exception occurs.
});
// Or, block until result is given, then handle it in the same thread.
// Though at this point, you should really just invoke the decompile method on the
// decompiler itself rather than incorrectly use the pooled methods provided by
// the decompile manager.
DecompileResult result = decompilerManager.decompile(decompiler, workspace, classToDecompile)
.get(1, TimeUnit.SECONDS);
Pre-processing decompiler input
The decompiler manager allows you to register filters for inputs fed to decompilers. This allows you to modify the contents of classes in any arbitrary way prior to decompilation, without actually making a change to the class in the workspace.
Here is an example where we strip out all debug information (generics, local variable names, etc.):
JvmBytecodeFilter debugRemovingFilter = (workspace, classInfo, bytecode) -> {
// The contents we return here will be what is fed to the decompiler, instead of the original class present in the workspace.
ClassWriter writer = new ClassWriter(0);
classInfo.getClassReader().accept(writer, ClassReader.SKIP_DEBUG);
return writer.toByteArray();
};
// The filter can be added/removed to/from all decompilers by using the decompile manager.
decompilerManager.addJvmBytecodeFilter(debugRemovingFilter);
decompilerManager.removeJvmBytecodeFilter(debugRemovingFilter);
// If you want to only apply the filter to one decompiler, you can do that as well.
decompiler.addJvmBytecodeFilter(debugRemovingFilter);
decompiler.removeJvmBytecodeFilter(debugRemovingFilter);
Post-processing decompiler output
Similar to pre-processing, the output of decompilation can be filtered via the decompiler manager's filters. These operate on the `String` output of decompilers.
OutputTextFilter tabConversionFilter = (workspace, classInfo, code) -> {
return code.replace(" ", "\t");
};
// Add/remove to/from all decompilers
decompilerManager.addOutputTextFilter(tabConversionFilter);
decompilerManager.removeOutputTextFilter(tabConversionFilter);
// Add/remove to/from a single decompiler
decompiler.addOutputTextFilter(tabConversionFilter);
decompiler.removeOutputTextFilter(tabConversionFilter);
GsonProvider
The Gson provider manages a shared `Gson` instance and allows services and plugins to register custom JSON serialization support. This serves multiple purposes:
- JSON formatting is shared across services that use it for serialization
- Services can register custom serialization for private types (see `KeybindingConfig` for an example of this)
- Plugins that register custom config containers with `ConfigManager` can register support for their own custom types
Registering custom serialization
The Gson provider offers methods for registering the following:
- `TypeAdapter` - For handling serialization and deserialization of a type.
- `InstanceCreator` - For assisting construction of types that do not provide no-arg constructors.
- `JsonSerializer` - For specifying serialization support only.
- `JsonDeserializer` - For specifying deserialization support only.
For more details on how to use each of the given types, see the Gson JavaDocs.
TypeAdapter:
gsonProvider.addTypeAdapter(ExampleType.class, new TypeAdapter<>() {
@Override
public void write(JsonWriter out, ExampleType value) throws IOException {
// Manual serialization of 'value' goes here using 'out'
}
@Override
public ExampleType read(JsonReader in) throws IOException {
// Manual deserialization of the type from 'in' goes here
return new ExampleType(...);
}
});
InstanceCreator:
gsonProvider.addTypeInstanceCreator(ExampleType.class, type -> new ExampleType(...));
JsonSerializer:
gsonProvider.addTypeSerializer(ExampleType.class, (src, typeOfSrc, context) -> {
// Manual serialization of 'T src' into a 'JsonElement' return value goes here
return new JsonObject();
});
JsonDeserializer:
gsonProvider.addTypeDeserializer(ExampleType.class, (json, typeOfT, context) -> {
// Manual deserialization of (JsonElement json) goes here
return new ExampleType(...);
});
InfoImporter
The info importer can import an Info
from a ByteSource
. Since Info
defines isClass/asClass
and isFile/asFile
you can determine what sub-type the item is through those methods, or instanceof
checks.
Examples
Reading a class file:
// Wrap input in ByteSource.
byte[] classBytes = Files.readAllBytes(Paths.get("HelloWorld.class"));
ByteSource source = ByteSources.wrap(classBytes);
// Parse into an info object.
Info read = importer.readInfo("HelloWorld", source);
// Cast to 'JvmClassInfo' with 'asX' methods
JvmClassInfo classInfo = read.asClass().asJvmClass();
// Or use instanceof
if (read instanceof JvmClassInfo classInfo) {
// ...
}
Reading a generic file and acting based on the type of content it is (like text/images/video/audio/etc.):
// Wrap input in ByteSource.
byte[] textRaw = Files.readAllBytes(Paths.get("Unknown.dat"));
ByteSource source = ByteSources.wrap(textRaw);
// Parse into an info object.
Info read = importer.readInfo("Unknown.dat", source);
// Do action based on file type.
FileInfo readFile = read.asFile();
if (readFile.isTextFile()) {
TextFileInfo textFile = readFile.asTextFile();
String text = textFile.getText();
// ...
} else if (readFile.isVideoFile()) {
VideoFileInfo videoFile = readFile.asVideoFile();
// ...
}
InheritanceGraphService
The InheritanceGraphService
allows you to create InheritanceGraph
instances, which model the parent/child relations between classes and interfaces in the workspace.
Getting an InheritanceGraph instance
The InheritanceGraph
type can be created for any arbitrary Workspace
. This service will always keep a shared copy of the inheritance graph for the current workspace.
// You can make a new graph from any workspace.
InheritanceGraph graph = inheritanceGraphService.newInheritanceGraph(workspace);
// Or get the current (shared) graph for the current workspace if one is open in the workspace manager.
graph = inheritanceGraphService.getCurrentWorkspaceGraph(); // Can be 'null' if no workspace is open.
Parents and children
We will use the following classes in the examples below:
interface Edible {}
interface Red {}
class Apple implements Edible, Red {}
class AppleWithWorm extends Apple {}
class Grape implements Edible {}
classDiagram
    class Edible { <<interface>> }
    class Red { <<interface>> }
    class Apple { <<class>> }
    class AppleWithWorm { <<class>> }
    class Grape { <<class>> }
    Edible <|-- Grape
    Edible <|-- Apple
    Apple *-- AppleWithWorm
    Red <|-- Apple
Accessing parent types
You can access direct parents with getParents()
which returns a Set<InheritanceVertex>
, or parents()
which returns a Stream<InheritanceVertex>
. Direct parents include the class's super-type and any interfaces implemented directly by the class. For example, AppleWithWorm
will implement Edible
and Red
, but these are not direct parents since they are not declared directly on the class definition.
You can access all parents with getAllParents()
which returns a Set<InheritanceVertex>
, or allParents()
which returns a Stream<InheritanceVertex>
.
InheritanceVertex apple = graph.getVertex("Apple");
InheritanceVertex wormApple = graph.getVertex("AppleWithWorm");
InheritanceVertex red = graph.getVertex("Red");
// Get parents as sets of graph vertices
// The set 'appleParents' will have 2 elements: Edible, Red
// The set 'wormAppleParents' will have 1 element: Apple
// The set 'wormAppleAllParents' will have 3 elements: Apple, Edible, Red
// The set 'redParents' will be empty
Set<InheritanceVertex> appleParents = apple.getParents();
Set<InheritanceVertex> wormAppleParents = wormApple.getParents();
Set<InheritanceVertex> wormAppleAllParents = wormApple.getAllParents();
Set<InheritanceVertex> redParents = red.getParents();
// Alternative: Stream<InheritanceVertex>
wormApple.parents();
wormApple.allParents();
Accessing child types
You can access direct children with getChildren()
which returns a Set<InheritanceVertex>
, or children()
which returns a Stream<InheritanceVertex>
. Direct children are just the reverse of the direct parent relationship described above.
You can access all children with getAllChildren()
which returns a Set<InheritanceVertex>
, or allChildren()
which returns a Stream<InheritanceVertex>
.
InheritanceVertex apple = graph.getVertex("Apple");
InheritanceVertex wormApple = graph.getVertex("AppleWithWorm");
InheritanceVertex red = graph.getVertex("Red");
// Get children as a set of graph vertices
// The set 'appleChildren' will have 1 element: AppleWithWorm
// The set 'wormChildren' will be empty
// The set 'redChildren' will have 1 element: Apple
// The set 'redAllChildren' will have 2 elements: Apple, AppleWithWorm
Set<InheritanceVertex> appleChildren = apple.getChildren();
Set<InheritanceVertex> wormChildren = wormApple.getChildren();
Set<InheritanceVertex> redChildren = red.getChildren();
Set<InheritanceVertex> redAllChildren = red.getAllChildren();
// Alternative: Stream<InheritanceVertex>
apple.children();
apple.allChildren();
Accessing complete type hierarchy (parents and children)
You can access direct children & parents with getAllDirectVertices()
which combines the results of getChildren()
and getParents()
.
You can access all related vertices with getFamily(boolean includeObject)
, which is effectively a recursive application of getAllDirectVertices()
. If you pass true
it will include all types except the edge cases described below in the edge-case section. You will probably only ever pass false
to getFamily(...)
.
// Direct will contain: Edible, Red, AppleWithWorm
Set<InheritanceVertex> appleDirects = apple.getAllDirectVertices();
// Family will contain: Edible, Red, AppleWithWorm, Apple (itself), Grape
// - Grape will be included because of the shared parent Edible
Set<InheritanceVertex> appleFamily = apple.getFamily(false);
Edge case: Classes without super-types
All classes must define a super-type. Each time you define a new class it will implicitly extend java/lang/Object
, unless it is an enum
, in which case it will extend java/lang/Enum
, which in turn extends java/lang/Object
. There are only a few exceptions to these rules.
Module classes, denoted by the name module-info
, do not define super-types. Their super-type index in the class file points to index 0
, an edge case treated as null
in this situation.
The Object
class also has no super-type, for obvious enough reasons.
The inheritance graph accommodates these edge cases. It may be useful information for you to know regardless.
Edge case: Cyclic inheritance from obfuscators
Some obfuscators may create classes that are unused in the application logic, but exist solely to screw with analysis tools. Consider the following example:
class A extends B {}
class B extends A {}
classDiagram
    A *-- B
    B *-- A
This code will not compile, but there is nothing stopping an obfuscator from creating these classes. If an analysis tool naively tries to find all parents of A
it will look at B
then A
again, then B
and you have yourself an infinite loop.
The inheritance graph tracks what types in a hierarchy have already been visited and short-circuits hierarchy searches in paths where it finds cycles.
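As a quick sketch, assuming the cyclic classes A and B above were present in a workspace's graph, the traversal below terminates rather than looping forever:
// Returns a finite set instead of hanging, thanks to the cycle short-circuiting
InheritanceVertex a = graph.getVertex("A");
Set<InheritanceVertex> parentsOfA = a.getAllParents();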
Getting the common type of two classes
You can get the common type of any two classes by passing their names to InheritanceGraph
's getCommon(String a, String b)
method.
// common will be 'Edible'
String common = graph.getCommon("Apple", "Grape");
classDiagram
    class Edible { <<interface>> }
    class Apple { <<class>> }
    class Grape { <<class>> }
    note for Edible "Common parent of Apple and Grape"
    Edible <|-- Apple
    Edible <|-- Grape
Checking if a type is assignable from another
Java's java.lang.Class
has an isAssignableFrom(Class)
method which the inheritance graph mirrors.
// Will return true since both Apple/Grape implement Edible
graph.isAssignableFrom("Edible", "Apple");
graph.isAssignableFrom("Edible", "Grape");
// The inverse will return false
graph.isAssignableFrom("Apple", "Edible");
graph.isAssignableFrom("Grape", "Edible");
JavacCompiler
The javac
compiler is just a service wrapping the standard JDK javac
API with some quality of life improvements. All the arguments for the compiler are created through the JavacArgumentsBuilder
type.
Examples
The following examples can be used within this sample script:
@Dependent
public class CompilerScript {
private final JavacCompiler javac;
@Inject
public CompilerScript(JavacCompiler javac) {
this.javac = javac;
}
public void run() {
// code goes here
}
}
Compiling "Hello World"
The most common case, taking some source code and compiling it. You are required to specify the name of the class being compiled as an internal name (Example: com/example/Foo
) and the source. Any extra arguments are optional.
// Compile a 'hello world' application
JavacArguments arguments = new JavacArgumentsBuilder()
.withClassName("HelloWorld")
.withClassSource("""
public class HelloWorld {
public static void main(String[] args) {
System.out.println("Hello world");
}
}""")
.build();
// Run compiler and handle results
CompilerResult result = javac.compile(arguments, null, null);
if (result.wasSuccess()) {
CompileMap compilations = result.getCompilations();
compilations.forEach((name, bytecode) -> {
// Do something with name/bytecode pair
});
}
Handling compiler feedback/errors
Compiler feedback is accessible from the returned CompilerResult
as List<CompilerDiagnostic> getDiagnostics()
.
result.getDiagnostics().forEach(diagnostic -> {
if (diagnostic.level() != CompilerDiagnostic.Level.ERROR) return;
System.err.println(diagnostic.line() + ":" + diagnostic.column() + " --> " + diagnostic.message());
});
Changing the compiled bytecode target version
Adapting the setup from before, you can change the target bytecode version via withVersionTarget(int)
. This takes the release version of Java you wish to target. This is equivalent to javac --release N
where N
is the version. Because this uses the JDK environment you ran Recaf with, the supported versions here are tied to what javac
supports.
// Compile a 'hello world' application against Java 11
int version = 11;
JavacArguments arguments = new JavacArgumentsBuilder()
.withVersionTarget(version)
.withClassName("HelloWorld")
.withClassSource("""
public class HelloWorld {
public static void main(String[] args) {
System.out.println("Hello world");
}
}""")
.build();
Downsampling the compiled bytecode instead of directly targeting it
Alternatively you may want to downsample compiled code instead of targeting that version from within the compiler. This allows you to use new language features while still targeting older versions of Java.
// Compile a 'hello world' application but downsample it to an older version
JavacArguments arguments = new JavacArgumentsBuilder()
.withDownsampleTarget(8) // Downsample to Java 8
.withClassName("HelloWorld")
.withClassSource("""
public class HelloWorld {
public static void main(String[] args) {
System.out.println(message());
}
private static String message() {
int r = new java.util.Random().nextInt(5);
// Using switch expressions, which do not exist in Java 8
return switch (r) {
case 0 -> "Zero";
case 1 -> "One";
case 2 -> "Two";
default -> "Three or more";
};
}
}""")
.build();
Compiling code with references to classes in the Workspace
All you need to do is call compile(JavacArguments arguments, Workspace workspace, JavacListener listener)
with a non-null Workspace
instance. This will automatically include it as a classpath entry, allowing you to compile code referencing types defined in the workspace.
There is also compile(JavacArguments arguments, Workspace workspace, List<WorkspaceResource> supplementaryResources, JavacListener listener)
which allows you to supply extra classpath data without adding it to the workspace.
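As a rough sketch, assuming the workspace contains a class com/example/Foo, compiling code that references it could look like this:
// Compile a class that references a (hypothetical) type from the workspace
JavacArguments arguments = new JavacArgumentsBuilder()
.withClassName("FooCaller")
.withClassSource("""
public class FooCaller {
public static String describe() {
// 'com.example.Foo' is assumed to be a class in the workspace
return String.valueOf(new com.example.Foo());
}
}""")
.build();
// Passing a non-null workspace puts its classes on the compiler classpath
CompilerResult result = javac.compile(arguments, workspace, null);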
Compiling code with debug info enabled
You can enable compiling with debug information by passing true
to the withDebug
methods in the arguments builder.
JavacArguments arguments = new JavacArgumentsBuilder()
.withDebugLineNumbers(true)
.withDebugSourceName(true)
.withDebugVariables(true)
.build();
Loading and executing the compiled code
The CompileMap
you get out of the CompilerResult
is an implementation of Map<String, byte[]>
. You can thus use the compilation map directly in a utility like ClassDefiner
. Using the hello world classes from the above examples:
CompileMap compilations = result.getCompilations();
ClassDefiner definer = new ClassDefiner(compilations);
try {
Class<?> helloWorld = definer.findClass("HelloWorld");
Method main = helloWorld.getDeclaredMethod("main", String[].class);
main.invoke(null, (Object) new String[0]);
} catch (Exception ex) {
ex.printStackTrace();
}
MappingApplierService
The mapping applier service takes in a Mappings
instance and applies it to a Workspace
, or a sub-set of specific classes in a Workspace
.
Targeting mappings in a workspace
Mappings can be applied to any Workspace
. You will either pass a workspace reference to the applier service, or use the convenience method for an applier in the currently open workspace.
// Applier in an arbitrary workspace
Workspace workspace = ...
MappingApplier applier = mappingApplierService.inWorkspace(workspace);
// Applier in the current workspace (assuming one is open)
// If no workspace is open, this applier will be 'null'
MappingApplier applier = mappingApplierService.inCurrentWorkspace();
Mapping the whole workspace
To apply mappings to the workspace (affecting classes in the primary resource) pass any Mappings
of your choice to applyToPrimaryResource(Mappings)
.
Mappings mappings = ...
// Create the results containing the mapped output
MappingResults results = applier.applyToPrimaryResource(mappings);
Mapping specific classes
To apply mappings to just a few specific classes, use applyToClasses(Mappings, WorkspaceResource, JvmClassBundle, Collection<JvmClassInfo>)
.
// Example inputs
Mappings mappings = ...
WorkspaceResource resource = ...
JvmClassBundle bundle = ... // A bundle in the resource
List<JvmClassInfo> classesToMap = ... // Classes in the bundle
// Create the results containing the mapped output
MappingResults results = applier.applyToClasses(mappings, resource, bundle, classesToMap);
Operating on the results
The MappingResults
you get from the MappingApplier
contains a summary of:
- The mappings used
- The classes that will be affected
- The paths to classes in their existing state
- The paths to classes in their post-mapping state
To apply the results call apply()
.
// Optional: Inspect the results
// =============================
// Names of affected classes: pre-mapped name --> post-mapped name
// - If the class was updated because it contains a mapped reference but was itself not mapped
// then the key/value are equal
Map<String, String> mappedClasses = results.getMappedClasses();
// Map of pre-mapped names to paths into the workspace of the class
Map<String, ClassPathNode> preMappingPaths = results.getPreMappingPaths();
// Map of post-mapped names to paths into the workspace at the location they
// would appear at after applying the results.
Map<String, ClassPathNode> postMappingPaths = results.getPostMappingPaths();
// Apply the mapping results (updates the workspace)
results.apply();
MappingFormatManager
The mapping format manager tracks recognized mapping formats, allowing you to:
- Iterate over the names of supported formats
- Create instances of MappingFileFormat based on a supported format name
  - Used to parse mapping files into Recaf's IntermediateMappings model, which can be used in a variety of other mapping APIs
- Register new mapping file formats
Iterating over recognized formats
To find out the types of mapping files Recaf supports, invoke getMappingFileFormats()
to get a Set<String>
of the supported file format names.
// The names in the returned set can be used to create instances of mapping file formats
Set<String> formats = mappingFormatManager.getMappingFileFormats();
Creating MappingFileFormat
instances
To create an instance of a MappingFileFormat
, invoke createFormatInstance(String)
with the name of the supported file format.
// Read contents of file
String fabricMappingContents = Files.readString(Paths.get("yarn-1.18.2+build.2-tiny"));
// Create the mapping format for some recognized type, then parse the mapping file's contents
MappingFileFormat format = mappingFormatManager.createFormatInstance("Tiny-V1");
IntermediateMappings parsedMappings = format.parse(fabricMappingContents);
// Do something with your parsed mappings
Registering new MappingFileFormat
instances
To register a new kind of mapping file format, invoke registerFormat(String, Supplier<MappingFileFormat>)
. The name argument should match the MappingFileFormat.implementationName()
of the registered format implementation.
Given this example implementation of a MappingFileFormat
:
import jakarta.annotation.Nonnull;
import software.coley.recaf.services.mapping.format.AbstractMappingFileFormat;
import software.coley.recaf.services.mapping.format.InvalidMappingException;
public class ExampleFormat extends AbstractMappingFileFormat {
public ExampleFormat() {
super("CustomFormat", /* supportFieldTypeDifferentiation */ true, /* supportVariableTypeDifferentiation */ true);
}
@Nonnull
@Override
public IntermediateMappings parse(@Nonnull String mappingsText) throws InvalidMappingException {
IntermediateMappings mappings = new IntermediateMappings();
String[] lines = mappingsText.split("\n");
for (String line : lines) {
// 0 1 2
// class obfuscated-name clean-name
if (line.startsWith("class\t")) {
String[] columns = line.split("\t");
String obfuscatedName = columns[1];
String cleanName = columns[2];
// Add class mapping to output
mappings.addClass(obfuscatedName, cleanName);
}
// 0 1 2 3 4
// member obfuscated-declaring-class obfuscated-name obfuscated-desc clean-name
if (line.startsWith("member\t")) {
String[] columns = line.split("\t");
String obfuscatedDeclaringClass = columns[1];
String obfuscatedDesc = columns[3]; // If 'supportFieldTypeDifferentiation == false' then this would be null
String obfuscatedName = columns[2];
String cleanName = columns[4];
// Add method or field mapping to output depending on the descriptor
if (obfuscatedDesc.charAt(0) == '(')
mappings.addMethod(obfuscatedDeclaringClass, obfuscatedDesc, obfuscatedName, cleanName);
else
mappings.addField(obfuscatedDeclaringClass, obfuscatedDesc, obfuscatedName, cleanName);
}
}
return mappings;
}
}
It should be registered with:
mappingFormatManager.registerFormat("CustomFormat", ExampleFormat::new);
MappingGenerator
The mapping generator allows you to generate Mappings
for a Workspace
based on configurable inputs:
- Filter what classes, fields, and methods should be included in the generated output mappings via a chain of NameGeneratorFilter items
- Control the naming scheme of classes, fields, and methods via an implementation of NameGenerator
Filtering what to generate mappings for
What we generate mappings for is controlled by a linked-list of NameGeneratorFilter
items. Each item in the chain can be generalized to "include this" or "exclude this". Here is an example:
@Inject
StringPredicateProvider strMatchProvider;
// [Access blacklist 'public, protected'] --> [Class whitelist 'com/example/']
// Any non-public/protected class/field/method in 'com/example' will have a name generated
IncludeClassesFilter includeClasses = new IncludeClassesFilter(null /* tail of linked list */, strMatchProvider.newStartsWithPredicate("com/example/"));
ExcludeModifiersNameFilter excludePublicProtected = new ExcludeModifiersNameFilter(includeClasses, Arrays.asList(Opcodes.ACC_PUBLIC | Opcodes.ACC_PROTECTED), true, true, true);
// Use 'excludePublicProtected' as the 'NameGeneratorFilter' to pass as your filter - It is the head of the linked list.
You can use any of the existing NameGeneratorFilter
implementations in the software.coley.recaf.services.mapping.gen.filter
package, or make your own.
Controlling the naming scheme
There are a few simple implementations of NameGenerator
which can be used as-is, but for more advanced control you'll probably want to make your own. The interface outlines one method for naming each kind of item. Here is a simple implementation:
NameGenerator nameGenerator = new NameGenerator() {
@Nonnull
@Override
public String mapClass(@Nonnull ClassInfo info) {
return "mapped/Class" + Math.abs(info.getName().hashCode());
}
@Nonnull
@Override
public String mapField(@Nonnull ClassInfo owner, @Nonnull FieldMember field) {
return "mappedField" + Math.abs(owner.getName().hashCode() + info.getName().hashCode());
}
@Nonnull
@Override
public String mapMethod(@Nonnull ClassInfo owner, @Nonnull MethodMember method) {
return "mappedMethod" + Math.abs(owner.getName().hashCode() + info.getName().hashCode());
}
@Nonnull
@Override
public String mapVariable(@Nonnull ClassInfo owner, @Nonnull MethodMember declaringMethod, @Nonnull LocalVariable variable) {
return "mappedVar" + variable.getIndex();
}
};
Generating the output Mappings
Once you have a NameGenerator
and NameGeneratorFilter
pass them along to generate(Workspace, WorkspaceResource, InheritanceGraph, NameGenerator, NameGeneratorFilter)
. The method takes in the Workspace
and the WorkspaceResource
containing classes you want to generate mappings for. The WorkspaceResource
will almost always be the workspace's primary resource.
@Inject
InheritanceGraph inheritanceGraph; // You need the inheritance graph associated with the workspace.
Mappings mappings = mappingGenerator.generate(workspace, resource, inheritanceGraph, nameGenerator, filter);
MappingListeners
The mapping listeners service allows you to listen to when mappings are applied to any Workspace
.
Listening to mapping operations
Just call addMappingApplicationListener(MappingApplicationListener)
with your listener implementation.
Here is an example implementation with some comments explaining the contents of the mapping results model:
class ExampleMappingListener implements MappingApplicationListener {
@Override
public void onPreApply(@Nonnull MappingResults mappingResults) {
// The mappings that were used to create the results
Mappings mappings = mappingResults.getMappings();
// All names of classes *affected* by mappings can be iterated over.
//
// If a class was not renamed, but had contents inside it that point to renamed content
// it will be included in this map and the key/value will be equal.
// Otherwise, the post-map-name will be the class's renamed name.
mappingResults.getMappedClasses().forEach((preMapName, postMapName) -> {
ClassPathNode preMappingPath = mappingResults.getPreMappingPath(preMapName);
ClassPathNode postMappingPath = mappingResults.getPostMappingPath(postMapName);
// The 'results' model already has the contents of classes after mapping is applied.
// They just have not been copied back into the workspace yet.
ClassInfo preMappedClass = preMappingPath.getValue();
ClassInfo postMappedClass = postMappingPath.getValue();
});
}
@Override
public void onPostApply(@Nonnull MappingResults mappingResults) {
// The results model is the same as the 'onPreApply' but the workspace has now been
// updated to replace old classes with the updated instances.
}
}
NameGeneratorProviders
The NameGeneratorProviders
service allows you to:
- See which NameGeneratorProvider instances are available
- Register your own NameGeneratorProvider
Get current name providers
// Read-only map of generator ids to generator provider instances
Map<String, NameGeneratorProvider<?>> providers = nameGeneratorProviders.getProviders();
Registering a new NameGeneratorProvider
// AbstractNameGeneratorProvider implements most things for you.
// All that you need to do is pass a unique 'id' in the constructor and implement 'createGenerator()'
nameGeneratorProviders.registerProvider(new AbstractNameGeneratorProvider<>("my-unique-id") {
@Nonnull
@Override
public NameGenerator createGenerator() {
// Create your name generator here
}
});
PatchApplier
The patch applier applies WorkspacePatch
values to a given Workspace
.
Generating patches
See PatchProvider:
- For auto-creating patches based on changes made in an existing
Workspace
- For loading patches from JSON
You could also manually construct the WorkspacePatch
instance yourself.
Applying patches
// Optional feedback interface implementation for receiving details about patch failures.
// Can be 'null' to ignore feedback.
PatchFeedback feedback = new PatchFeedback() {
@Override
public void onAssemblerErrorsObserved(@Nonnull List<Error> errors) {
// assembler patch has failed, patch process abandoned
}
@Override
public void onIncompletePathObserved(@Nonnull PathNode<?> path) {
// patch had path that was invalid, patch process abandoned
}
};
// If the patch was applied, we return 'true'
// If errors were seen, the patch is abandoned and we return 'false'
boolean success = patchApplier.apply(patch, feedback);
PatchProvider
The patch provider facilitates the creation of WorkspacePatch
instances.
Generating patches from changes in a workspace
A patch that represents all the changes made to a workspace (removing files, editing classes, etc.) can be made by calling createPatch(Workspace)
.
Workspace workspace = ...
// Some changes to the workspace are made...
// Generate a patch that represents the changes
WorkspacePatch patch = patchProvider.createPatch(workspace);
Reading/writing patches from JSON
Patches can be persisted to a JSON representation via serializePatch(WorkspacePatch)
and deserializePatch(Workspace, String)
.
// Given a 'WorkspacePatch' transform it into JSON.
String serializedJson = patchProvider.serializePatch(patch);
// Given some JSON transform it back into a patch.
// We pass along the workspace that this patch will be applied to.
WorkspacePatch deserializedPatch = patchProvider.deserializePatch(workspace, serializedJson);
Applying patches
See PatchApplier
PathExportingManager
The path exporting manager facilitates exporting various workspace types to files, prompting the user to provide locations to save to.
Exporting the current workspace
The currently open workspace can be exported to a user-provided path like so:
try { pathExportingManager.exportCurrent(); }
catch (IllegalStateException ex) { /* no workspace open */ }
Exporting a specific workspace
Any workspace instance can also be exported:
// Delegates to export(workspace, "workspace", true)
pathExportingManager.export(workspace);
// Prompts the user to:
// - Alert them if the workspace has no changes recorded in it (see alertWhenEmpty)
// - Provide a location to save the workspace to (if you loaded a jar, you should probably provide a location like "export.jar")
boolean alertWhenEmpty = false;
String description = "some files"; // Used in logger output so that we see "exported 'some files' to %PATH%"
pathExportingManager.export(workspace, description, alertWhenEmpty);
Exporting a specific class/file
Specific Info
types like JvmClassInfo
and FileInfo
can also be exported to user-provided paths:
pathExportingManager.export(classInfo);
pathExportingManager.export(fileInfo);
PathLoadingManager
The path loading manager controls loading Workspace
instances in the UI. It has the following capabilities:
- Register listeners that intercept the
java.nio.Path
locations of user inputs before the
is constructed from the paths. - Asynchronously open a workspace from a
java.nio.Path
plus supporting resources from a list ofjava.nio.Path
values. - Asynchronously append supporting resources to a workspace from a list of
java.nio.Path
values.
Intercepting user input paths
Registering a listener can allow you to see what file paths a user is requesting to load into Recaf.
private final PathLoadingManager loadManager;
// ...
loadManager.addPreLoadListener((primaryPath, supportingPaths) -> {
if (supportingPaths.isEmpty())
logger.info("Loading workspace from {}", primaryPath);
else
logger.info("Loading workspace from {} + [{}]", primaryPath,
supportingPaths.stream().map(Path::toString).collect(Collectors.joining(", ")));
});
Loading workspace content into Recaf asynchronously
As opposed to directly using WorkspaceManager
, this class handles things asynchronously since it's intended for use in the UI. Here's how you can load a file as a workspace:
private final PathLoadingManager loadManager;
// ...
Path path = Paths.get("input.jar");
List<Path> supportingPaths = Arrays.asList(Paths.get("library-1.jar"), Paths.get("library-2.jar"));
CompletableFuture<Workspace> future = loadManager.asyncNewWorkspace(path, supportingPaths,
error -> logger.warn("Failed to load from '{}'", path));
Adding to the current workspace:
Workspace workspace = workspaceManager.getCurrent();
List<Path> supportingPaths = Arrays.asList(Paths.get("library-1.jar"), Paths.get("library-2.jar"));
CompletableFuture<List<WorkspaceResource>> future = loadManager.asyncAddSupportingResourcesToWorkspace(workspace, supportingPaths,
error -> logger.warn("Failed to append {}", supportingPaths.stream()
.map(Path::toString)
.collect(Collectors.joining(", "))));
PhantomGenerator
The phantom generator service allows you to create phantoms for:
- Entire workspaces at a time
- One or more specific classes from a workspace
Generating phantoms
For generating phantoms for all primary classes in a workspace:
@Inject
PhantomGenerator phantomGenerator;
@Inject
Workspace workspace; // Injected in this example to pull in the 'current' workspace, but it can be any arbitrary workspace
// Most common use case is to then append the phantoms to the workspace
// so they can be used to supplement other services operating off of the current workspace.
GeneratedPhantomWorkspaceResource phantomResource = phantomGenerator.createPhantomsForWorkspace(workspace);
workspace.addSupportingResource(phantomResource);
For generating phantoms for just a few classes:
List<JvmClassInfo> classes = workspace.findJvmClasses(c -> c.getName().startsWith("com/example")).stream()
.map(path -> path.getValue().asJvmClass())
.toList();
// Phantoms in the output will only be generated to satisfy missing references in the 'classes' we pass.
GeneratedPhantomWorkspaceResource phantomResource = phantomGenerator.createPhantomsForClasses(workspace, classes);
ResourceImporter
The resource importer can import a WorkspaceResource
from a variety of inputs such as:
- ByteSource - Delegates to some data source providing content as a byte[]
- File - Can point to a file or directory
- Path - Can point to a file or directory
- URL - Can point to any content source that can be streamed from
- URI - Can point to any content source that can be streamed from
Reading from different content types
When providing content from an in-memory source, ByteSource
can be used:
// Any content that can be represented in 'byte[]' can be wrapped into a 'ByteSource'
byte[] helloBytes = "Hello".getBytes(StandardCharsets.UTF_8);
ByteSource source = ByteSources.wrap(helloBytes);
WorkspaceResource resource = importer.importResource(source);
// The utility class 'ByteSources' has a number of helpful methods, example paths:
Path path = Paths.get("test.jar");
ByteSource source = ByteSources.forPath(path);
WorkspaceResource resource = importer.importResource(source);
// The utility class 'ZipCreationUtils' also may be useful if you want to easily
// bundle multiple items together into one source.
// It can make a ZIP from a Map<String, byte[]> or from individual items by using
// a builder pattern via 'ZipCreationUtils.builder()'.
String name = "com/example/Demo";
byte[] bytes = ...
Map<String, byte[]> map = new LinkedHashMap<>();
map.put(name + ".class", bytes);
map.put(JarFileInfo.MULTI_RELEASE_PREFIX + "9/" + name + ".class", bytes);
map.put(JarFileInfo.MULTI_RELEASE_PREFIX + "10/" + name + ".class", bytes);
map.put(JarFileInfo.MULTI_RELEASE_PREFIX + "11/" + name + ".class", bytes);
byte[] zipBytes = ZipCreationUtils.createZip(map);
ByteSource zipSource = ByteSources.wrap(zipBytes);
WorkspaceResource resource = importer.importResource(zipSource);
When providing content from an on-disk source, a File
or Path
reference can be used:
// NIO Path
Path path = Paths.get("test.jar");
WorkspaceResource resource = importer.importResource(path);
// Old IO File
File file = new File("test.jar");
WorkspaceResource resource = importer.importResource(file);
When providing content from a URL/URI, content can be either on-disk or remote, so long as streaming for the URL scheme is supported:
// URI from local file
URI uri = File.createTempFile("prefix", "test.zip").toURI();
WorkspaceResource resource = importer.importResource(uri);
// URL from a remote file
URL url = new URL("https://example.com/example.zip");
WorkspaceResource resource = importer.importResource(url);
ScriptEngine
The script engine is a service used to:
- Run single Java files as Recaf scripts.
- Compile single Java files without running them, yielding the
java.lang.Class
of the generated script.
Running scripts
Calling run(String)
will asynchronously compile and run the passed script contents. As documented in the scripting section, these can be as short or long as you desire. Here are some examples of varying complexity:
@Inject
ScriptEngine engine;
// A simple one-liner
engine.run("System.setProperty(\"foo\", \"bar\");").thenAccept(result -> {
if (result.wasSuccess())
System.out.println(System.getProperty("foo")); // Will print "bar"
});
// Recaf's script system allows you to also define full classes. Any method 'void run()' will be executed.
// It also supports injection of *any* of Recaf's services of *any* scope. If a workspace is currently
// active you can inject it or any workspace-scoped service.
// Check the scripting section for more information.
String code = """
public class Test implements Runnable {
@Inject
JavacCompiler compiler;
@Override
public void run() {
System.out.println("hello: " + compiler);
if (compiler == null) throw new IllegalStateException();
}
}
""";
engine.run(code).thenAccept(result -> {
// At this point we printed 'hello: JavacCompiler@71841' or whatever the instance hash is at the moment.
});
Compiling scripts
If you wish to get the java.lang.Class
of the generated script without immediately running it, you can use compile(String)
to asynchronously get the compiled class.
engine.compile("System.setProperty(\"foo\", \"bar\");").thenAccept(result -> {
if (result.wasSuccess()) {
Class<?> scriptClass = result.cls();
// Do what you want with the class
}
});
ScriptManager
The script manager tracks recognized scripts in the Recaf scripts directory. It can also be used to parse arbitrary java.nio.Path
items into ScriptFile
instances.
Local scripts
In the Recaf root directory a sub-directory named scripts
is watched for changes. Any files found in this directory will be checked for being valid scripts and recorded in this manager if they match. You can access these scripts and even listen for when scripts are added and removed via getScriptFiles()
which returns an ObservableCollection
of ScriptFile
s.
// Iterating over the currently known scripts
for (ScriptFile script : scriptManager.getScriptFiles()) {
// ...
}
// Listening for changes in local scripts
scriptManager.getScriptFiles().addChangeListener((ob, oldScriptList, newScriptList) -> {
// The files changed between the old and new list instances
List<ScriptFile> disjoint = Lists.disjoint(oldScriptList, newScriptList);
});
Reading files as scripts
To parse a script from a file path call read(Path)
:
ScriptFile script = scriptManager.read(Paths.get("Example.java"));
String content = script.source();
String metadataName = script.name(); // Meta-data not specified in the script file will yield an empty string
String metadataDesc = script.description();
String metadataVersion = script.version();
String metadataAuthor = script.author();
String metadataCustom = script.getTagValue("custom");
See the scripting section for more information about the contents of script files.
SearchService
The search service allows you to search workspaces:
- For strings, numbers, and references to classes and/or members
- With the ability to cancel the search early
- With the ability to control which classes and files are searched in
- With the ability to control what results are included in the final output
Query model
All searches are built from Query
instances. There are three types of queries:
- AndroidClassQuery (not yet implemented)
- JvmClassQuery
- FileQuery
Each implementation creates a SearchVisitor
that handles searching of individual items in the Workspace
. Searches like string, number, and reference searches implement whichever of these query types are relevant to them. For example, the reference search only implements AndroidClassQuery
and JvmClassQuery
, but the string search implements all three since strings can appear in any of these places (classes and files).
Searches for common cases like strings, numbers, and references are already implemented as queries and take in predicates for matching content. The following examples assume the following services are injected:
@Inject
NumberPredicateProvider numMatchProvider;
@Inject
StringPredicateProvider strMatchProvider;
@Inject
SearchService searchService;
String querying
Results results = searchService.search(classesWorkspace, new StringQuery(strMatchProvider.newEqualPredicate("Hello world")));
Results results = searchService.search(classesWorkspace, new StringQuery(strMatchProvider.newStartsWithPredicate("Hello")));
Results results = searchService.search(classesWorkspace, new StringQuery(strMatchProvider.newEndsWithPredicate("world")));
All the available built-in predicates come from StringPredicateProvider
, or you can provide your own predicate implementation.
Number querying
Results results = searchService.search(classesWorkspace, new NumberQuery(numMatchProvider.newEqualsPredicate(4)));
Results results = searchService.search(classesWorkspace, new NumberQuery(numMatchProvider.newAnyOfPredicate(6, 32, 256)));
Results results = searchService.search(classesWorkspace, new NumberQuery(numMatchProvider.newRangePredicate(0, 10)));
All the available built-in predicates come from NumberPredicateProvider
, or you can provide your own predicate implementation.
Reference querying
Each aspect of a reference (declaring class, name, descriptor) is its own string predicate. You can pass null
for any of these predicates to match anything for that given aspect. A simple example to find System.out.println()
calls would look like:
Results results = searchService.search(classesWorkspace, new ReferenceQuery(
strMatchProvider.newEqualPredicate("java/lang/System"), // declaring class predicate
strMatchProvider.newEqualPredicate("out"), // reference name predicate
strMatchProvider.newEqualPredicate("Ljava/io/PrintStream;") // reference descriptor predicate
));
If you want to find all references to a given package you could do something like this:
Results results = searchService.search(classesWorkspace, new ReferenceQuery(
strMatchProvider.newStartsWithPredicate("com/example/"),
null, // match any field/method name
null // match any field/method descriptor
));
Feedback handler
Passing a feedback handler to the search
(...) methods allows you to control what classes and files are searched in by implementing the doVisitClass(ClassInfo)
and doVisitFile(FileInfo)
methods. Here is a basic example which skips searching classes in a given package:
// All methods in the feedback interface default to visit everything, and include all results.
// You can override the 'doVisitX' methods to control the searching of content within the passed classes/files.
class SkipClassesInPackage implements SearchFeedback {
private final String pkg;
SkipClassesInPackage(String pkg) { this.pkg = pkg; }
@Override
public boolean doVisitClass(@Nonnull ClassInfo cls) {
// Skip (return false) when the class is in the given package
return !cls.getName().startsWith(pkg);
}
}
SearchFeedback skipping = new SkipClassesInPackage("com/example/");
To abort a search early you would implement hasRequestedCancellation()
to return true
after some point. A basic built-in implementation exists:
// There is a built-in cancellable search implementation.
CancellableSearchFeedback cancellable = new CancellableSearchFeedback();
// Aborts the current search that this feedback is associated with.
cancellable.cancel();
To limit which results are included in the final Results
of the search(...)
call, implement doAcceptResult(Result<?>)
to return false
for results you want to discard. Since the Result
contains a PathNode
reference to where the match was made, it's probably what you'll want to operate on to implement your own filtering. Here is an example which limits the final Results
to include only one item per class:
// This is a silly example, but we really just want to show off how you'd implement this, not how to make a real-world implementation.
class OnlyOneResultPerClass implements SearchFeedback {
private Set<String> includedClassNames = new HashSet<>();
@Override
public boolean doAcceptResult(@Nonnull Result<?> result) {
PathNode<?> pathToValue = result.getPath();
// Get the class value in the path to the value.
// If the path points to something more specific like an instruction in a method, then
// this will be the class that defines the method with that instruction in it.
ClassInfo classPathValue = pathToValue.getValueOfType(ClassInfo.class);
if (classPathValue != null && !includedClassNames.add(classPathValue.getName())) {
// If we've already seen a result from this class, skip all the remaining results
// so that there is only one result per class.
return false;
}
// Keep the result in the output
return true;
}
}
SnippetManager
The snippet manager is used to store common snippets of assembler text that users can copy and paste into the assembler UI.
Snippets
Snippets are a simple record with three components:
- name - The name, also used as the key for snippet manager operations.
- description - The optional explanation of what the snippet is for. If no such value is given this should be an empty string.
- content - The actual snippet body.
Getting current snippets
// Snapshot of existing snippets
List<Snippet> snippets = snippetManager.getSnippets();
// Getting a snippet by name
Snippet example = snippetManager.getByName("example");
Registering / unregistering snippets
// Create and register a snippet that prints "hello"
String content = """
getstatic java/lang/System.out Ljava/io/PrintStream;
ldc "hello"
invokevirtual java/io/PrintStream.println (Ljava/lang/String;)V
""";
snippetManager.putSnippet(new Snippet("hello", "prints 'hello'", content));
// Unregistering it
snippetManager.removeSnippet("hello");
Listening to the creation/removal/modification of snippets
SnippetListener listener = new SnippetListener() {
@Override
public void onSnippetAdded(@Nonnull Snippet snippet) {
System.out.println("NEW: " + snippet.name());
}
@Override
public void onSnippetModified(@Nonnull Snippet old, @Nonnull Snippet current) {
System.out.println("MOD: " + old.name());
}
@Override
public void onSnippetRemoved(@Nonnull Snippet snippet) {
System.out.println("DEL: " + snippet.name());
}
};
snippetManager.addSnippetListener(listener);
snippetManager.removeSnippetListener(listener);
TransformationApplierService
The transformation applier service allows you to take transformers registered in the TransformationManager
and apply them to the current workspace, or any arbitrary workspace.
Creating a transformation applier
// Will be null if there is no current workspace open
TransformationApplier applier = transformationApplierService.newApplierForCurrentWorkspace();
// Will never be 'null' so long as the workspace variable is also not 'null'
// - The downside is that dependent features like the inheritance graph (for frame computation needed in some transformers)
// will have to be generated each time you call this method since the applier will need access to that service for this workspace
// that is not the "current" workspace.
// - If you happen to pass the "current" workspace as a parameter then this will delegate to the
// more optimal 'newApplierForCurrentWorkspace()' method.
Workspace workspace = ...
TransformationApplier applier = transformationApplierService.newApplier(workspace);
TransformationManager
The transformation manager allows registering different transformer types for class processing operations.
Example Transformer
public class MyTransformer implements JvmClassTransformer {
/** Optional, can delete if you do not need any one-time setup logic */
@Override
public void setup(@Nonnull JvmTransformerContext context, @Nonnull Workspace workspace) {
// Transformation setup here
// - Pulling values from the context
// - Checking contents of the workspace
// - Etc.
}
/** Used to transform the classes */
@Override
public void transform(@Nonnull JvmTransformerContext context, @Nonnull Workspace workspace,
@Nonnull WorkspaceResource resource, @Nonnull JvmClassBundle bundle,
@Nonnull JvmClassInfo classInfo) throws TransformationException {
// Illustrative flags for this example; a real transformer would have its own logic
boolean exampleOfRawEdit = ...
boolean exampleOfAsmTreeEdit = ...
if (exampleOfRawEdit) {
// IMPORTANT: Use this method to get the bytecode, and DO NOT use the direct 'classInfo.getBytecode()'!
// The context will store updated bytecode across multiple transformations, so if you use the direct
// bytecode from the 'ClassInfo' you risk losing all previous transform operations.
byte[] modifiedBytecode = context.getBytecode(bundle, classInfo);
// TODO: Make changes to 'modifiedBytecode'
// Save modified bytecode into the context
context.setBytecode(bundle, classInfo, modifiedBytecode);
} else if (exampleOfAsmTreeEdit) {
// IMPORTANT: The same note as above applies here, but with ASM's ClassNode.
ClassNode node = context.getNode(bundle, classInfo);
// TODO: Make changes to 'node'
// Save modified class-node (and its respective bytecode) into the context
context.setNode(bundle, classInfo, node);
}
}
/** Unique name of this transformer */
@Nonnull
@Override
public String name() {
return "My cool transformer";
}
/** Any dependencies this transformer relies on. Some transformers are used for analysis and store data
* that can be accessed later, and depending on those transformers ensures the data is accessible when
* this transformer is invoked. */
@Nonnull
@Override
public Set<Class<? extends JvmClassTransformer>> dependencies() {
// Optional method, you can delete this if you have no dependencies.
// But if you do use dependencies, you can get instances of them via 'context.getJvmTransformer(OtherTransformer.class)'
// in the 'setup' and 'transform' methods above.
return Collections.emptySet();
}
}
Registering transformers
// Registering and unregistering
transformationManager.registerJvmClassTransformer(MyTransformer.class, () -> new MyTransformer());
transformationManager.unregisterJvmClassTransformer(MyTransformer.class);
WorkspaceManager
The workspace manager allows you to:
- Access the current workspace
- Set the current workspace
- Add listeners to be notified of:
- New workspaces being opened
- Workspaces being closed
- Changes to existing workspaces (The model, not the content) being made
Accessing the current workspace
The current workspace is accessed via getWorkspace()
.
Workspace workspace = workspaceManager.getWorkspace();
if (workspace != null) {
// ...
} else {
// No workspace open
}
This method is also annotated with @Produces
and @Dependent
, which allows @Inject
to provide the current workspace to other @Dependent
classes & scripts.
@Inject Constructor(Workspace workspace) {
if (workspace != null) {
// ...
} else {
// No workspace open
}
}
Setting the workspace
Assigning a workspace is done via setWorkspace(Workspace)
. You can "unset" or close a workspace by passing null
or by calling closeCurrent()
.
Workspace workspace = ...
workspaceManager.setWorkspace(workspace);
// These two calls behave the same
workspaceManager.closeCurrent();
workspaceManager.setWorkspace(null);
In the case where a WorkspaceCloseCondition
has been registered, the request to close a workspace can be blocked. Consider that when you are using the GUI and you close a file, you are asked "Are you sure?" before the workspace is closed. This is achieved by registering a WorkspaceCloseCondition
in the UI, which requires answering the prompt before allowing the close to occur.
While it is not recommended, you can circumvent such conditions by using setCurrentIgnoringConditions(Workspace)
instead of setWorkspace(Workspace)
.
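For example, assuming newWorkspace is a workspace you want to force into place:
// Skips any registered 'WorkspaceCloseCondition' checks for the workspace being replaced
workspaceManager.setCurrentIgnoringConditions(newWorkspace);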
Listening for new workspaces
Register a WorkspaceOpenListener
.
workspaceManager.addWorkspaceOpenListener(workspace -> {
// Operate on newly opened workspace
});
Listening to workspace closures
Register a WorkspaceCloseListener
. Mostly useful for read-only handling such as logging.
workspaceManager.addWorkspaceCloseListener(workspace -> {
// Operate on closed workspace
});
Similarly you can have a WorkspaceCloseCondition
if you want to listen to and prevent workspace closures.
workspaceManager.addWorkspaceCloseCondition(workspace -> {
// Returning 'false' will prevent a workspace from being closed.
if (shouldPreventClosure(workspace)) return false;
return true;
});
Listening to workspace structure modifications
Normally you would add a WorkspaceModificationListener
on a specific Workspace
but in the WorkspaceManager
you can add a "default" WorkspaceModificationListener
which is added to all newly opened workspaces.
workspaceManager.addDefaultWorkspaceModificationListeners(new WorkspaceModificationListener() {
@Override
public void onAddLibrary(@Nonnull Workspace workspace, @Nonnull WorkspaceResource library) {
// Supporting library added to workspace
}
@Override
public void onRemoveLibrary(@Nonnull Workspace workspace, @Nonnull WorkspaceResource library) {
// Supporting library removed from workspace
}
});
WorkspaceProcessingService
The workspace processing service allows registering custom WorkspaceProcessor
implementations. These processors take in a Workspace
parameter and can do anything. They are applied to any workspace that gets opened via the WorkspaceManager
. Generally these are intended to do lightweight operations such as the ThrowablePropertyAssigningProcessor
which adds the ThrowableProperty
to appropriate ClassInfo
values in the workspace. For things more along the lines of bytecode manipulation you will want to check out the TransformationManager
and TransformationApplierService
.
Registering processors
class MyProcessor implements WorkspaceProcessor {
@Override
public void processWorkspace(@Nonnull Workspace workspace) {
// Processing goes here
}
}
// Registering and unregistering
processingService.register(MyProcessor.class, () -> new MyProcessor());
processingService.unregister(MyProcessor.class);
Workspace scoped services
These services are available for injection only while a workspace is open. When a workspace is closed you will not be able to inject instances of these types until the next workspace is opened.
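For example, a minimal sketch of a script injecting one of the services listed below; the injection only resolves while a workspace is open:
@Dependent
public class ExampleWorkspaceScopedScript {
// Workspace-scoped: only injectable while a workspace is open
@Inject
AggregateMappingManager aggManager;
public void run() {
// Use the workspace-scoped service here
}
}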
Core
These are the services defined in the core
module.
- AggregateMappingManager
- AssemblerPipelineManager
- AstService
- ConfigIconManager
- ConfigComponentManager
UI
There are no new services defined in the ui
module.
AggregateMappingManager
The aggregate mapping manager maintains an AggregatedMappings
instance (which extends IntermediateMappings
) representing the sum of all mapping operations applied to the current workspace.
Getting the aggregate mappings
You can get the aggregate mappings on demand, or register a listener to be notified when they change:
@Inject AggregateMappingManager aggManager;
// Get the mappings on-demand
AggregatedMappings mappings = aggManager.getAggregatedMappings();
// Register a listener to be notified of changes
aggManager.addAggregatedMappingsListener(mappings -> {
// Called when any workspace mappings are applied
});
Utilities
There are quite a few utility classes in Recaf that all serve as independent units or holders of various static utility methods.
Core util groups
UI util groups
- None
Android
A collection of android utilities.
- AndroidRes
- AndroidXmlUtil
- DexIOUtil
ASM Visitors
A collection of ASM visitors for various class transformations.
- AnnotationArrayVisitor
- BogusNameRemovingVisitor
- ClassAnnotationInsertingVisitor
- ClassAnnotationRemovingVisitor
- ClassHollowingVisitor
- DuplicateAnnotationRemovingVisitor
- FieldAnnotationInsertingVisitor
- FieldAnnotationRemovingVisitor
- FieldInsertingVisitor
- FieldReplacingVisitor
- IllegalAnnotationRemovingVisitor
- IllegalSignatureRemovingVisitor
- IndexCountingMethodVisitor
- LongAnnotationRemovingVisitor
- MemberCopyingVisitor
- MemberFilteringVisitor
- MemberRemovingVisitor
- MemberStubAddingVisitor
- MethodAnnotationInsertingVisitor
- MethodAnnotationRemovingVisitor
- MethodInsertingVisitor
- MethodNoopingVisitor
- MethodReplacingVisitor
- SignatureRemovingVisitor
- SyntheticRemovingVisitor
- VariableRemovingClassVisitor
- VariableRemovingMethodVisitor
IO
A collection of IO utilities.
- ByteSource
- ByteArraySource
- ByteBufferSource
- LocalFileHeaderSource
- MemorySegmentDataSource
- PathByteSource
- ByteSourceConsumer
- ByteSourceElement
- ByteSources
Threading
A collection of threading utilities.
- CountDown
- ExecutorServiceDelegate
- PhasingExecutorService
- ScheduledExecutorServiceDelegate
- ThreadPoolFactory
- ThreadUtil
Misc
A collection of unsorted utilities.
- AccessFlag
- AccessPatcher
- BlwUtil
- ByteHeaderUtil
- CancelSignal
- ClassDefiner
- ClassLoaderInternals
- ClasspathUtil
- CollectionUtil
- DesktopUtil
- DevDetection
- EscapeUtil
- Handles
- InternalPath
- IOUtil
- JavaVersion
- JigsawUtil
- Keywords
- LookupUtil
- MemoizedFunctions
- ModulesIOUtil
- MultiMap
- MultiMapBuilder
- NumberUtil
- PlatformType
- ReflectUtil
- RegexUtil
- ResourceUtil
- SelfReferenceUtil
- ShortcutUtil
- Streams
- StringDiff
- StringUtil
- TestEnvironment
- Types
- UnsafeIO
- UnsafeUtil
- ZipCreationUtils
ClassDefiner
The ClassDefiner
is a utility extending ClassLoader
which takes in one or more classes as a Map<String, byte[]>
and defines them at runtime.
- Keys are class names in the format you'd use for Class.forName(String), and thus would look like java.lang.String.
- Values are the raw bytes of the class file.
Usage
String name = "com/example/StringMapper"; // Internal name format
String sourceName = name.replace('/', '.'); // Source name format
// Example code to generate an interface
ClassWriter cw = new ClassWriter(0);
cw.visit(V1_8, ACC_PUBLIC | ACC_ABSTRACT | ACC_INTERFACE, name, null, "java/lang/Object", null);
cw.visitMethod(ACC_PUBLIC | ACC_ABSTRACT, "map", "(Ljava/lang/String;)Ljava/lang/String;", null, null).visitEnd();
cw.visitEnd();
byte[] bytes = cw.toByteArray();
// Definer with a single class
ClassDefiner cd = new ClassDefiner(sourceName, bytes);
// Definer with a map of entries
ClassDefiner cd = new ClassDefiner(Map.of(sourceName, bytes));
// Loading the class
Class<?> cls = cd.findClass(sourceName);
Miscellaneous
Developer articles that don't really fit anywhere else.
How to improve test cases
Tests serve a variety of purposes.
- Validate a feature does what it is supposed to.
- Validate changes to a feature do not break its expected supported usage.
But there are some additional things you can write in your test cases that may be useful.
- Validate a feature fails gracefully when given invalid inputs.
- Validate a feature works, even when given as many edge-cases as possible, as opposed to "common" input data. (A sketch of such a test follows below.)
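As an example of that last point, here is a hedged sketch of an edge-case test using JUnit (the ExampleUtil.parse method is hypothetical):
@Test
void handlesEmptyInputGracefully() {
// Validate graceful failure on invalid input rather than an unexpected crash
assertDoesNotThrow(() -> ExampleUtil.parse(""));
}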
How can you quickly tell what circumstances are covered by test cases though? The obvious option is to read the test cases and figure that out yourself. Ideally, the names of tests are descriptive enough to limit the amount of required reading for that. But there are additional options such as looking at test coverage and checking for branches that are not covered by tests.
Checking code coverage
You can check which branches in the source are covered by running the tests via gradlew test
and then building the coverage report via gradlew buildJacocoAggregate
. All tests are configured to log coverage data via JaCoCo, and the buildJacocoAggregate
task consolidates coverage from all tests in all modules into a single report. You can access the report in ./build/aggregate/
.
The main index page in the HTML report is a table showing which packages have the best code coverage. You can click on the packages to get data on each class in the package.
Clicking on a class shows per-method coverage.
Clicking on a method finally shows you the actual class source code, with coverage information shown as line indicators. Clicking/hovering on them will reveal information like "1 out of 2 branches covered".
Naturally, the more code that is covered the better, so using these reports to figure out where coverage is missing really helps.
Why not use the built-in IntelliJ code-coverage feature when running unit tests?
Unfortunately, IntelliJ's code coverage support refuses to run if you attempt to run tests across multiple modules. It's been an open ticket since 2022 with no observed movement. You can use it if you test one module like recaf-core
then another such as recaf-ui
one after another, but you cannot combine those results together. If you just want to improve coverage over the recaf-core
module then this is not an issue and you can use it to get accurate coverage data shown inline in the IDE.
Checking code coverage online
If you want to see the current code coverage statistics and color-coded source without running the tests locally, you can check out the latest codecov.io/Col-E/Recaf report. Do note that CodeCov measures coverage slightly differently than JaCoCo, so the numbers may not match exactly, but they should generally be in the same ballpark.
Configuring annotations in IntelliJ
To cut down on the number of problems with NullPointerException
, nullability annotations are heavily used in Recaf's source. With an IDE's support, these can catch problems early on.
You'll want to make it so that @Nonnull
and @Nullable
are recognized by IntelliJ. The settings you'll want to change are highlighted here.
This first image makes it so that method overrides/implementations with annotations missing from their parent types show a warning, and allows you to automatically insert these missing annotations. Plus, when you use IntelliJ to automatically implement methods from an abstract class
or interface
, these annotations will be copied for you.
This second image makes it so the Jakarta Nonnull
and Nullable
annotations used in the project are recognized by IntelliJ. This allows us to have a basic system for tracking nullability. You'll also want to ensure the default annotation is the Jakarta one after you add it to the list.
Methods that can return null
or parameters that may be null
should be marked with Nullable
. Any time one of these is used, it will warn you when you do not have an appropriate null
check on a value.
Similarly, any method/parameter that should not be null
should be marked with Nonnull
. We use a plugin that generates null
checks at compile time, so the contract for non-null is enforced at runtime.
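Putting those conventions together, a minimal sketch of a method following this pattern (the method and field here are hypothetical):
private final Map<String, String> displayNames = new HashMap<>();
@Nullable
public String lookupDisplayName(@Nonnull String internalName) {
// May return null when no display name is known; callers must null-check
return displayNames.get(internalName);
}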
Faster builds in IntelliJ
Open your IntelliJ settings once the project is open and navigate to Build, Execution, Deployment | Build Tools | Gradle
. Change the "using" options to IDEA instead of Gradle.
You will need to run gradlew build
at least once before doing this to create a few files generated by the build script.