Artem Razin
Low-level software protection engineer with 20+ years in native and managed code security. Creator of ArmDot, protecting commercial .NET applications since 2014.

Trial Version Protection in .NET: Stopping the Bypass

Every developer who ships a trial version discovers the same thing eventually. They check their analytics, find a forum thread, or get a support request that makes no sense, and they realize: people are running the trial indefinitely. Resetting the trial period by deleting a registry key. Rolling the system clock back. Reinstalling the application. Using a tool specifically designed to extend trial limits.

The trial conversion rate that was supposed to reflect genuine purchase interest instead reflects an unknown mix of people who tried and decided not to buy, and people who tried and never needed to buy because they found a way around the limit.

The attacks on trial protection are narrower and more predictable than attacks on serial key schemes. An attacker is not trying to forge cryptographic credentials - they are trying to manipulate the conditions the trial logic observes. That makes the threat model concrete and the defenses specific.

System clock manipulation

If the trial uses DateTime.Now to check whether the trial period has elapsed, setting the system clock back to before the trial started defeats it. Any implementation that relies solely on the wall clock is trivially bypassed.

The counter: record the installation timestamp in a location the user cannot easily modify, and compare not just against the current time but against evidence of time passage. The modification timestamp of a file written during installation, the creation time of registry entries the user has no reason to touch, or the last-run timestamp stored in an encrypted location all provide secondary evidence. If the current date is earlier than the last recorded run date, someone has moved the clock backward.
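A minimal sketch of the backward-clock check described above. The file path and plain-text storage here are illustrative placeholders; real code would encrypt the value and replicate it across locations:

```csharp
using System;
using System.Globalization;
using System.IO;

// Illustrative sketch: detect a rolled-back clock by comparing the current
// time against the last recorded run timestamp.
static class ClockEvidence
{
    // Pure check: the clock moved backward if "now" precedes the last run.
    public static bool MovedBackward(DateTime? lastRun, DateTime now)
        => lastRun.HasValue && now < lastRun.Value;

    // Reads the stored last-run timestamp, checks it, and records this run.
    public static bool CheckAndRecordRun(string statePath)
    {
        DateTime now = DateTime.UtcNow;
        DateTime? lastRun = null;
        if (File.Exists(statePath))
            lastRun = DateTime.ParseExact(File.ReadAllText(statePath), "O",
                CultureInfo.InvariantCulture, DateTimeStyles.RoundtripKind);

        bool tampered = MovedBackward(lastRun, now);
        if (!tampered)
            File.WriteAllText(statePath, now.ToString("O")); // round-trip format
        return tampered;
    }
}
```

The pure `MovedBackward` helper is deliberately separated from the storage code so the detection rule itself stays trivial to audit.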

Registry key deletion

Any trial that stores its state in a standard, predictable registry path under HKCU or HKLM is vulnerable to deletion. The attacker removes the key, relaunches the application, and the trial resets to day one.

The counter: use storage locations the attacker cannot easily find and clear. Obfuscated paths, encrypted values, multiple redundant storage locations that cross-verify each other. If one storage location is missing but others are intact, the trial has been tampered with. Storing state in locations that require elevated permissions to modify adds friction, though it does not stop a determined attacker with administrator access.
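The cross-verification logic reduces to a small decision. This sketch uses hypothetical names and leaves out how each copy is actually read (registry, file, or another location):

```csharp
using System;
using System.Linq;

// Illustrative sketch: the trial start date is written to several independent
// locations; partial deletion or disagreement between copies signals tampering.
static class RedundantTrialState
{
    public static bool LooksTampered(DateTime?[] copies)
    {
        int present = copies.Count(c => c.HasValue);
        if (present == 0) return false;              // no copies: genuine first run
        if (present < copies.Length) return true;    // some copies were deleted
        return copies.Distinct().Count() > 1;        // copies disagree
    }
}
```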

One approach I have seen in practice: encrypt the application's settings using a key derived from the stored installation date. If someone tampers with the installation date to reset the trial, the encryption key changes and all saved settings become unrecoverable. The application can also compare the current date against the stored installation date - if the current date is earlier, the clock has been rolled back. This creates a natural consequence for tampering: not just a trial reset failure, but the loss of all accumulated user configuration.
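A minimal sketch of that scheme, deriving a 256-bit AES key from the stored installation date. The salt and iteration count are illustrative choices, not a recommendation:

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

// Illustrative sketch: derive the settings-encryption key from the stored
// installation date. Changing the date to reset the trial changes the key,
// so previously saved settings no longer decrypt.
static class SettingsKey
{
    public static byte[] Derive(DateTime installDate, byte[] salt)
    {
        byte[] material = Encoding.UTF8.GetBytes(installDate.ToString("yyyy-MM-dd"));
        using var kdf = new Rfc2898DeriveBytes(material, salt, 100_000,
            HashAlgorithmName.SHA256);
        return kdf.GetBytes(32); // AES-256 key
    }
}
```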

Application reinstallation

Any trial that stores state exclusively in user-removable locations - AppData folders, registry entries under the current user, or any path wiped on reinstall - resets when the application is reinstalled.

The counter is machine-level binding rather than installation-level binding. A hardware fingerprint ties the trial to the physical machine rather than the installation instance. A reinstall on the same machine produces the same fingerprint and the trial state is recognized as existing. This is the same hardware ID approach described in the hardware ID locking guide, applied to trial tracking rather than license binding.
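As a sketch, a machine fingerprint for trial tracking is just a hash over whatever stable identifiers are collected; the identifier strings here are placeholders for the values the hardware ID guide describes:

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

// Illustrative sketch: combine stable machine identifiers into one opaque
// fingerprint. The identifiers themselves (volume serial, MAC address,
// CPU id) would be gathered as in the hardware ID locking guide.
static class MachineFingerprint
{
    public static string Compute(params string[] identifiers)
    {
        byte[] hash = SHA256.HashData(
            Encoding.UTF8.GetBytes(string.Join("|", identifiers)));
        return Convert.ToHexString(hash);
    }
}
```

Because the fingerprint is derived from the machine, a reinstall reproduces the same value and the stored trial state can be matched to it.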

Virtual machine snapshot rollback

A more sophisticated version of the reinstallation attack: spin up a VM, activate the trial, take a snapshot, use the software, then restore the snapshot to reset to the trial-start state. This is more common in enterprise evaluation contexts than casual piracy, but it is a real vector.

The counter combines VM detection with hardware ID binding that accounts for VM-specific hardware identifiers. The goal is not to block VM use entirely - many legitimate customers run software in virtualized environments - but to detect when the trial state has been rolled back by comparing stored timestamps against the evidence of time passage described above.
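One hedged way to turn "evidence of time passage" into a rollback signal: keep a monotonically increasing run counter and compare the local copy against one stored outside the snapshot, for example on a licensing server. After a restore, the local counter regresses:

```csharp
// Illustrative sketch: a snapshot restore rewinds local state, so a local
// run counter falling behind an externally stored copy reveals the rollback.
static class SnapshotEvidence
{
    public static bool RolledBack(long localRunCounter, long lastCounterSeenRemotely)
        => localRunCounter < lastCounterSeenRemotely;
}
```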

Binary patching of the expiry check

This is the attack that connects trial protection directly to obfuscation, and it is the one that makes all the storage hardening above insufficient on its own.

An attacker opens the binary in dnSpy, finds the method that returns true when the trial has expired, and patches the return value to always return false. The trial logic can be as sophisticated as desired - redundant storage, clock manipulation detection, hardware fingerprinting - none of it matters if the final expiry check is a readable conditional branch that can be flipped in seconds.

This is the same attack described in the serial key generation guide for asymmetric key validation: the attacker does not defeat the mechanism, they bypass the check entirely.

The defense is the same: control flow obfuscation to make the expiry check difficult to locate, code virtualization on the evaluation method to make it impossible to patch through static analysis, and integrity checking to detect and respond to binary modification.

Integrity checking as a tamper response

ArmDot's [IntegrityChecking] attribute detects binary modification at runtime. If the protected assembly has been altered after obfuscation - a patched branch, a modified return value, a replaced method body - the integrity check fails and the application displays a message and exits.

[assembly: ArmDot.Client.IntegrityChecking(
    Text = "This application has been modified and cannot run.",
    Caption = "Integrity Error")]

This directly addresses the binary patching attack. An attacker who patches the trial expiry check also invalidates the integrity signature. The patch that was supposed to bypass the trial instead triggers a tamper response.

Integrity checking is currently available on Windows.

ArmDot's trial implementation

ArmDot implements trial protection through its licensing system rather than a separate trial mechanism. A trial is a time-limited license key with an embedded expiration date:

[ArmDot.Client.VirtualizeCode]
private bool IsTrialValid(string trialKey)
{
    ArmDot.Client.Api.PutKey(trialKey);
 
    switch (ArmDot.Client.Api.GetLicenseState())
    {
        case ArmDot.Client.Api.LicenseKeyState.Valid:
            return true; // Trial still active
        case ArmDot.Client.Api.LicenseKeyState.Expired:
            return false; // Trial period ended
        default:
            return false;
    }
}

The expiration date is embedded in the key and protected by an RSA signature - it cannot be modified without invalidating that signature. An attacker cannot extend the trial by editing the key. The only remaining attack is bypassing the validation call itself, which is where [VirtualizeCode] and [IntegrityChecking] apply.

ArmDot keys also support MaximumBuildDate, which serves a different purpose: limiting how long a license key works with newer builds. If you sell annual upgrade subscriptions, set the maximum build date to one year after purchase. Each obfuscated build records its build date automatically. A customer whose subscription has lapsed can continue running the version they have but cannot use builds produced after their maximum build date. ArmDot uses this model for its own licensing.
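The MaximumBuildDate rule reduces to a single comparison. This sketch shows the check's logic only, not ArmDot's API:

```csharp
using System;

// Illustrative sketch of the MaximumBuildDate rule: a key covers any build
// produced on or before the key's maximum build date. Lapsed customers keep
// running the builds they have; newer builds fail the check.
static class BuildDateRule
{
    public static bool KeyCoversBuild(DateTime buildDate, DateTime maximumBuildDate)
        => buildDate <= maximumBuildDate;
}
```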

The layered defense

No single mechanism prevents all trial bypass attacks. The robust approach combines multiple layers:

Storage hardening (redundant, encrypted, cross-verified timestamps) stops casual reset attempts. Hardware ID binding stops reinstallation-based resets. Cryptographically signed expiration dates stop key modification. Code virtualization on the trial check stops binary patching. Integrity checking detects and responds to any modification that slips through.

Each layer addresses a specific attack vector. Together they make trial bypass require significantly more effort than the software is worth to most attackers - which is the same economic deterrence argument that applies to obfuscation generally.

Related: Hardware ID Locking in .NET → - binding the trial to the machine rather than the installation.

Related: Code Virtualization for .NET → - protecting the trial expiry check from binary patching.

Back to: .NET Licensing Protection →

ArmDot licensing

ArmDot provides trial protection through time-limited license keys with RSA-signed expiration dates, integrity checking for tamper detection, and code virtualization for the validation logic. The licensing API is part of the same ArmDot.Client NuGet package. Free trial available - protected assemblies stop working after two weeks.