
Remapping

Transform and standardize your log data by extracting and mapping fields for better searchability and analysis.

Remapping lets you standardize your log data by extracting fields from log lines and mapping them to consistent formats. This helps you normalize data across different services and sources, making your logs more searchable and easier to analyze.

Overview

Remapping consists of two primary functions:

  1. Extract: Pull specific fields or patterns from your log lines
  2. Map: Transform extracted fields into standardized formats

This capability is valuable for scenarios like:

  • Normalizing different service names across your infrastructure
  • Standardizing severity levels from various sources (ERROR, err, Fatal, 500)
  • Creating consistent environment labels (prod, production, prd)
  • Extracting structured data from JSON or pattern-based logs
  • Maintaining consistent field naming conventions

Working with Remapping

Extract

Control Plane — Remapping

  1. Navigate to Control Plane > Remapping
  2. Select the “Extract” tab
  3. View existing extraction rules in the table showing:
    • Name: Descriptive name of the extraction rule
    • Method: JSON or Pattern Match extraction method
    • Scope: Which lines the extraction applies to
    • Fields/Pattern: Which fields or patterns to extract
    • Action: How the extracted data is handled (Upsert/Insert)
    • Active Since: When the rule was activated
  4. Click “+ NEW RULE” to create a new extraction rule

Creating a New Extraction Rule

  1. Select “Extraction Method”:

    • JSON: Extract fields from structured JSON logs
    • Pattern Match: Use regex patterns to extract fields from unstructured logs
  2. Choose “Extraction Scope”:

    • All Lines: Apply extraction to every log line
    • Lines that match: Apply only to lines matching specific criteria
  3. Field(s) to Extract:

    1. For JSON method:

      • Select the field(s) to extract
      • Example fields: requestId, thread_id, logger_name, etc.
    2. For Pattern Match method:

      • Enter the regex pattern in “Pattern to Extract” field
      • Example: timeseries:\s*(?P<timeseries>\d+)
  4. Set “Action” to “Upsert” (update if exists, insert if not) or “Insert”

  5. Choose “Extract Into” option:

    • Log Attributes: Adds fields to the log’s searchable attributes
    • Resource Attributes: Adds fields to the resource’s metadata
  6. Optionally add a “Prefix” to extracted field names

    • Example: “ec2_” would transform “id” to “ec2_id”
  7. Enter a descriptive “Rule Name”

  8. Click “SAVE” to activate the rule
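As a rough illustration of what a Pattern Match rule does, the snippet below applies the example pattern from step 3 to a sample log line using Python's named capture groups. The log line itself is made up for illustration; only the pattern comes from the steps above.

```python
import re

# The example pattern from step 3, using a named capture group.
# The group name ("timeseries") becomes the extracted attribute key.
pattern = re.compile(r"timeseries:\s*(?P<timeseries>\d+)")

# Hypothetical unstructured log line.
log_line = "2024-05-01 12:00:00 INFO flushed timeseries: 4821 in 35ms"

match = pattern.search(log_line)
extracted = match.groupdict() if match else {}
# With a "Prefix" of "ec2_" configured in step 6, the key would
# become "ec2_timeseries" instead of "timeseries".
print(extracted)
```

The named group is what ties the matched value to an attribute name, so every field you want to extract needs its own `(?P<name>...)` group.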

Map

Control Plane — Remapping

  1. Navigate to Control Plane > Remapping

  2. Select the “Map” tab

  3. View “Remap Fields” section with existing mappings

  4. Map common fields to standardized formats:

    • Service: Map various service names to consistent values
      • Example: attributes["service_name"]
    • Severity: Map different log levels to standard severity
      • Example: attributes["level"] and attributes["levelname"]
    • Deployment Environment: Map environment indicators
      • Select from available attributes
  5. Preview the mapping results in the “Preview (Last 2 mins)” section below

    • SERVICE: How service names appear after mapping
    • SEVERITY: Standardized severity levels
    • DEPLOYMENT ENV: Normalized environment names
    • LOG ATTRIBUTES: Other log details
    • RESOURCE ATTR: Resource-related information
  6. After configuring mappings, click “SAVE”

Example Use Cases

  1. Standardizing Service Names: Map various service identifiers to consistent names

    • Raw values: “auth-svc”, “auth_service”, “authentication”
    • Mapped to: “authentication-service”
  2. Normalizing Severity Levels: Create consistent severity levels across sources

    • Raw values: “ERROR”, “err”, “Fatal”, “500”
    • Mapped to: “ERROR”
  3. Extracting Thread Information: Pull thread details from logs for better filtering

    • Extract fields: thread_id, thread_name, thread_priority
    • Makes thread-based troubleshooting more efficient
  4. Environment Consistency: Standardize environment naming

    • Raw values: “dev”, “development”, “preprod”, “staging”
    • Mapped to consistent environment names
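The use cases above boil down to lookup tables from raw values to canonical ones. Here is a minimal sketch: the raw values and the "authentication-service"/"ERROR" targets come from the examples, while the `normalize` helper and its pass-through behavior for unknown values are assumptions for illustration.

```python
# Hypothetical normalization tables built from use cases 1 and 2.
SERVICE_MAP = {
    "auth-svc": "authentication-service",
    "auth_service": "authentication-service",
    "authentication": "authentication-service",
}
SEVERITY_MAP = {
    "ERROR": "ERROR",
    "err": "ERROR",
    "Fatal": "ERROR",
    "500": "ERROR",
}

def normalize(value: str, table: dict) -> str:
    # Unknown raw values pass through unchanged rather than being dropped.
    return table.get(value, value)

print(normalize("auth-svc", SERVICE_MAP))
print(normalize("Fatal", SEVERITY_MAP))
```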

Tips for Effective Remapping

  • Start Simple: Begin with the most common fields you search by
  • Use Consistent Naming: Follow a naming convention for all mapped fields
  • Check Preview Results: Use the preview section to verify your mappings work as expected
  • Consider Extraction Order: Attributes are looked up in the order they are entered, so place higher-priority attributes first
  • Use JSON When Possible: JSON extraction is more reliable for structured logs
  • Test Pattern Matches: Validate regex patterns before implementing them
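One lightweight way to validate a pattern before saving a rule is to run it against a few representative sample lines, including ones you expect to fail. The candidate pattern and sample lines below are hypothetical:

```python
import re

# Hypothetical candidate pattern for a new extraction rule.
candidate = re.compile(r"thread_id:\s*(?P<thread_id>\d+)")

samples = [
    "worker started thread_id: 17",  # should match
    "worker started thread_id=17",   # '=' instead of ':' -- should not match
]

results = [candidate.search(line) for line in samples]
for line, m in zip(samples, results):
    print(f"{line!r} -> {m.groupdict() if m else 'no match'}")
```

A sample that unexpectedly fails to match (like the second line here) is exactly the kind of variation worth catching before the rule goes live.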

Troubleshooting

If your remapping rules aren’t working as expected:

  1. Check the extraction pattern syntax for errors
  2. Verify field names match exactly what appears in your logs
  3. Ensure your extraction scope is appropriate
  4. Look at the preview to confirm data is flowing as expected
  5. Try simplifying complex regex patterns

Please get in touch with us on Discord or via email if you have any questions.