TheHive 4.1.16 is out!


TheHive 4.1.16 has just been released. It contains an important bug fix along with a few improvements.

  • First of all, we removed an unnecessary dependency on Log4j, even though the version referenced was not vulnerable. We also made sure the logging framework we use – Logback – is safe, and updated it appropriately following their latest announcement;
  • The Similar Cases view has been improved: filters no longer persist from one Alert to another;
  • This version of TheHive fixes the updatedAt and updatedBy fields of Alerts in search results, which were previously displayed without these fields.

Handling immense terms

TheHive v4.1.16 also fixes an issue related to large data values found in the titles of Alerts, Cases and Tasks, or in Observable data values. Refer to the issue to get detailed information on the impacted data fields.

Long story short: the indexing engine cannot accept data larger than 32KB. So, if TheHive ingests anything larger than 32KB, it gets properly committed to the database but is not indexed. The complete value can neither be found nor displayed by the application afterwards. Worse still, reindexing operations will silently fail and prevent the application from starting correctly.
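To illustrate, the limit applies to the UTF-8 byte length of the value, not its character count; the error message quoted later in this post gives the exact figure of 32766 bytes. A minimal sketch of such a check (check_term_size is our own illustrative helper, not TheHive code):

```shell
# Illustrative only: flag a value whose UTF-8 encoding exceeds the
# 32766-byte limit mentioned in the indexing error message.
check_term_size() {
  bytes=$(printf '%s' "$1" | wc -c)
  if [ "$bytes" -gt 32766 ]; then
    echo "would not be indexed: $bytes bytes"
  fi
}
```

Note that a string of multi-byte characters can exceed the limit well before reaching 32766 characters, which is why the byte count is what matters.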

To deal with this limitation, the data model has been enhanced, along with the search and similarity-computation functions.

We highly recommend that everyone update to version 4.1.16.

Troubleshooting this update

If you run TheHive v4.1.15, then you can update to TheHive v4.1.16; the update should apply smoothly.

If you run a version older than TheHive v4.1.15, the direct update to version 4.1.16 starts with a full data reindexing, which can take some time depending on your database size and compute capacity.

Once the update is complete, there are two possible outcomes:

  1. If TheHive v4.1.16 starts successfully following the update, you are good to go! Your database probably did not contain large data records.
  2. If TheHive stalls and the logs keep showing messages like Reindex job XXX is running, then you are most likely facing an issue with large data records. One or more fields contain over 32KB of data.

Confirm the issue by looking at the logs

If you think you are facing the second outcome, confirm it by looking for the following pattern in your logs (usually located in /var/log/thehive/application.log): Document contains at least one immense term in field. In that case, the logs should look like this:

Caused by: org.janusgraph.diskstorage.PermanentBackendException: Permanent exception while executing backend operation IndexMutation
  at org.janusgraph.diskstorage.util.BackendOperation.executeDirect(BackendOperation.java:81)
  at org.janusgraph.diskstorage.util.BackendOperation.execute(BackendOperation.java:54)
  ... 51 common frames omitted
Caused by: java.lang.IllegalArgumentException: Document contains at least one immense term in field="title_____s" (whose UTF8 encoding is longer than the max length 32766), all of which were skipped.  Please correct the analyzer to not produce such terms.  The prefix of the first immense term is: '[76, 111, 114, 101, 109, 32, 105, 112, 115, 117, 109, 32, 100, 111, 108, 111, 114, 32, 115, 105, 116, 32, 97, 109, 101, 116, 44, 32, 99, 111]...', original message: bytes can be at most 32766 in length; got 50166

Example of log pattern highlighting large size fields
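A quick way to count matching lines is to grep for that pattern; the helper below is only a convenience wrapper of ours (adjust the log path to your installation):

```shell
# Count log lines reporting the immense-term error.
# Usage: has_immense_terms /var/log/thehive/application.log
has_immense_terms() {
  grep -c "Document contains at least one immense term" "$1"
}
```

A non-zero count means you are affected and should follow the procedure below.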

Fixing the issue

TheHive v4.1.16 was designed to fix this issue automatically for all newly ingested data. For existing data, however, you must follow this procedure strictly:

1. Stop the TheHive application

service thehive stop

2. Install TheHive v4.1.16 if you have not already done so

3. Update the configuration file /etc/thehive/application.conf. Insert the following three temporary parameters at the end of the file and save it

## -----------------------------------------------------------------
## Temporary configuration to solve the immense terms indexing issue
## This will be used as a first step of TheHive database initialization

## This is used to truncate titles if you have immense titles issues
db.janusgraph.immenseTermProcessing.title = "truncate(1024)"

## This is used to fix text observables with big values
db.janusgraph.immenseTermProcessing.data  = "observableHashToIndex"

## This is required to rebuild the index
db.janusgraph.forceDropAndRebuildIndex: true
## -----------------------------------------------------------------
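Before restarting, you can optionally double-check that the three temporary lines were saved; this helper is merely a suggestion of ours, not part of the official procedure:

```shell
# Count the temporary parameters present in a configuration file.
# Should print 3 when all of them were inserted correctly.
count_temp_params() {
  grep -Ec "immenseTermProcessing|forceDropAndRebuildIndex" "$1"
}
```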

4. Restart TheHive

service thehive start

Upon restart, TheHive will scan the entire database and:

1. Truncate titles if you have immense title issues: titles longer than 32KB are shortened to their first 1024 bytes;
2. Fix text observables with big values; the stored data is kept intact.

Then, TheHive will rebuild the index.
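To make the two behaviours concrete, here is a rough sketch under our own assumptions (these helpers mimic the fixes but are not TheHive internals): truncation keeps the first 1024 bytes of a title, while the observable fix indexes a fixed-size hash of the value and leaves the stored data untouched.

```shell
# Illustrative helpers, not TheHive code.
truncate_title() {
  # Keep only the first 1024 bytes, as with truncate(1024).
  printf '%s' "$1" | head -c 1024
}
hash_for_index() {
  # Index a fixed-size digest instead of the oversized value itself.
  printf '%s' "$1" | sha256sum | awk '{print $1}'
}
```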

Overall these operations are expected to take some time, depending on your database size and compute capacity.

You can follow the progress by tailing the logs (tail -f /var/log/thehive/application.log).

Once these operations are complete, the application will be available again. At this stage, do not use the application; stop it instead.

service thehive stop

5. Edit the configuration file again to remove the three temporary parameters inserted earlier and save it. Restart TheHive.

service thehive start

The application should now start successfully and become available for use in nominal mode.

As usual, we strongly recommend testing updates on acceptance systems before applying them to your production environment.

Note that if you are running a version older than v4.1.15 and plan to update to v4.1.16, there is no point in migrating to v4.1.15 first, as the reindexing will occur anyway and might get stuck if you have large data records.

Running Into Trouble?

If you are a customer of StrangeBee support services, please contact the helpdesk.

Should you encounter any difficulty, please join the community on Discord! We will be more than happy to help!