
Ingesting data into a collection using Amazon OpenSearch Ingestion

(Architecture diagram: osis-collection-pipeline)

This is an Amazon OpenSearch Ingestion project for CDK development with Python.

This project builds on the following tutorial: Ingesting data into a collection using Amazon OpenSearch Ingestion.

This project shows you how to use Amazon OpenSearch Ingestion to configure a simple pipeline and ingest data into an Amazon OpenSearch Serverless collection.
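For orientation, the core of such a stack can be sketched with the aws_opensearchserverless and aws_osis L1 constructs. This is a minimal sketch, not this project's actual stack: the collection name, pipeline name, capacity settings, and Data Prepper configuration below are illustrative assumptions, and the encryption, network, and data access policies (plus the pipeline's sts_role_arn) that a working deployment needs are omitted.

# A minimal sketch of wiring an OpenSearch Serverless collection to an
# OpenSearch Ingestion (OSIS) pipeline with CDK L1 constructs.
# Names, capacity, and the Data Prepper YAML are illustrative; a real pipeline
# also needs an sts_role_arn with write access to the collection, plus
# encryption, network, and data access policies, which are omitted for brevity.
from aws_cdk import Stack
from aws_cdk import aws_opensearchserverless as aoss
from aws_cdk import aws_osis as osis
from constructs import Construct


class OsisCollectionPipelineSketchStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Serverless collection that receives the ingested log data.
        collection = aoss.CfnCollection(self, "LogCollection",
            name="my-logs-collection",  # illustrative name
            type="SEARCH",
        )

        # Data Prepper pipeline configuration: an HTTP source and an
        # OpenSearch sink pointing at the collection endpoint.
        pipeline_configuration_body = f"""\
version: "2"
log-pipeline:
  source:
    http:
      path: "/log-pipeline/test_ingestion_path"
  sink:
    - opensearch:
        hosts: [ "{collection.attr_collection_endpoint}" ]
        index: "my_logs"
        aws:
          serverless: true
          region: "{self.region}"
"""

        osis.CfnPipeline(self, "LogPipeline",
            pipeline_name="log-pipeline",  # illustrative name
            min_units=1,
            max_units=2,
            pipeline_configuration_body=pipeline_configuration_body,
        )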

The cdk.json file tells the CDK Toolkit how to execute your app.

This project is set up like a standard Python project. The initialization process also creates a virtualenv within this project, stored under the .venv directory. To create the virtualenv it assumes that there is a python3 (or python for Windows) executable in your path with access to the venv package. If for any reason the automatic creation of the virtualenv fails, you can create the virtualenv manually.

To manually create a virtualenv on MacOS and Linux:

$ python3 -m venv .venv

After the init process completes and the virtualenv is created, you can use the following step to activate your virtualenv.

$ source .venv/bin/activate

If you are on a Windows platform, you would activate the virtualenv like this:

% .venv\Scripts\activate.bat

Once the virtualenv is activated, you can install the required dependencies.

(.venv) $ pip install -r requirements.txt

At this point you can synthesize the CloudFormation template for this code.

(.venv) $ export CDK_DEFAULT_ACCOUNT=$(aws sts get-caller-identity --query Account --output text)
(.venv) $ export CDK_DEFAULT_REGION=$(aws configure get region)
(.venv) $ cdk synth -c iam_user_name=your-iam-user-name --all

A note about Service-Linked Roles

Some collection configurations (e.g., VPC access) require the existence of the AWSServiceRoleForAmazonOpenSearchServerless Service-Linked Role.

When performing such operations via the AWS Console, this SLR is created automatically when needed. However, this is not the behavior when using CloudFormation. If an SLR (Service-Linked Role) is needed but doesn't exist, you will encounter a failure message similar to:

Before you can proceed, you must enable a service-linked role to give Amazon OpenSearch Service...

To resolve this, you need to create the SLR. We recommend using the AWS CLI:

aws iam create-service-linked-role --aws-service-name observability.aoss.amazonaws.com

OpenSearch Ingestion uses the service-linked role named AWSServiceRoleForAmazonOpenSearchIngestion. The attached policy provides the permissions necessary for the role to create a virtual private cloud (VPC) between your account and OpenSearch Ingestion, and to publish CloudWatch metrics to your account.

You need to create this SLR as well. We recommend using the AWS CLI:

aws iam create-service-linked-role --aws-service-name osis.amazonaws.com

ℹ️ For more information, see here.

Required IAM permissions for access to Amazon OpenSearch Serverless

⚠️ Amazon OpenSearch Serverless requires IAM permissions for access to its resources. You must add the following two IAM permissions for OpenSearch Serverless: "aoss:APIAccessAll" for Data Plane API access, and "aoss:DashboardsAccessAll" for Dashboards access. Failure to add these two IAM permissions will result in 403 errors starting on May 10th, 2023.

For a sample data-plane policy, see the sketch below:
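The following is a minimal sketch, assuming the IAM user name is passed via the iam_user_name CDK context key as in the deploy commands in this README; the construct IDs and the wildcard resource scope are illustrative assumptions, not this project's actual policy.

# A hedged sketch (not this project's actual code) of attaching the two required
# OpenSearch Serverless permissions to the IAM user passed in via CDK context.
from aws_cdk import Stack
from aws_cdk import aws_iam as iam
from constructs import Construct


class AossAccessSketchStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # e.g. passed on the command line: cdk deploy -c iam_user_name=your-iam-user-name
        iam_user_name = self.node.try_get_context("iam_user_name")
        user = iam.User.from_user_name(self, "OpenSearchUser", user_name=iam_user_name)

        user.attach_inline_policy(iam.Policy(self, "AossAccessPolicy",
            statements=[
                iam.PolicyStatement(
                    effect=iam.Effect.ALLOW,
                    actions=[
                        "aoss:APIAccessAll",         # Data Plane API access
                        "aoss:DashboardsAccessAll",  # OpenSearch Dashboards access
                    ],
                    resources=["*"],  # narrow to specific collection ARNs in practice
                )
            ],
        ))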

Deploy

Use the cdk deploy command to create the stack shown above.

(.venv) $ cdk deploy -c iam_user_name=your-iam-user-name --all

To add additional dependencies, for example other CDK libraries, just add them to your setup.py file and rerun the pip install -r requirements.txt command.
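For example, assuming the project uses a setup.py with an install_requires list (as the standard CDK Python template does), a new dependency is declared like this; the version pins are assumptions, not this project's actual pins.

# setup.py (excerpt) -- illustrative only.
import setuptools

setuptools.setup(
    name="osis-collection-pipeline",
    version="0.0.1",
    install_requires=[
        "aws-cdk-lib>=2.0.0",
        "constructs>=10.0.0",
        # add additional CDK libraries or other dependencies here, then rerun:
        #   pip install -r requirements.txt
    ],
    python_requires=">=3.7",
)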

Clean Up

Delete the CloudFormation stack by running the command below.

(.venv) $ cdk destroy -c iam_user_name=your-iam-user-name --force --all

Useful commands

  • cdk ls list all stacks in the app
  • cdk synth emits the synthesized CloudFormation template
  • cdk deploy deploy this stack to your default AWS account/region
  • cdk diff compare deployed stack with current state
  • cdk docs open CDK documentation

Enjoy!

Run Tests

Step 1: Ingest some sample data

First, get the ingestion URL from the Pipeline settings page:

(Screenshot: osis-pipeline-settings)
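If you prefer the SDK over the console, the ingestion endpoint can also be looked up programmatically. Below is a minimal sketch using boto3's osis client; it assumes the pipeline is named log-pipeline and is deployed in us-east-1.

# A hedged sketch: look up the OpenSearch Ingestion endpoint with boto3 instead
# of the console. The pipeline name and region are assumptions.
import boto3

osis_client = boto3.client("osis", region_name="us-east-1")
pipeline = osis_client.get_pipeline(PipelineName="log-pipeline")["Pipeline"]

# IngestEndpointUrls holds the host(s) to use in place of
# {pipeline-endpoint}.us-east-1.osis.amazonaws.com below.
print(pipeline["IngestEndpointUrls"])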

Then, ingest some sample data. The following sample request uses awscurl to send a single log entry to the my_logs index:

$ awscurl --service osis --region us-east-1 \
  -X POST \
  -H "Content-Type: application/json" \
  -d '[{"time":"2014-08-11T11:40:13+00:00","remote_addr":"122.226.223.69","status":"404","req
uest":"GET http://www.k2proxy.com//hello.html HTTP/1.1","http_user_agent":"Mozilla/4.0 (compatible; WOW64; SLCC2;)"}]' \
https://{pipeline-endpoint}.us-east-1.osis.amazonaws.com/log-pipeline/test_ingestion_path

You should see a 200 OK response.

Step 2: Query the sample data

Now, query the my_logs index to ensure that the log entry was successfully ingested:

$ awscurl --service aoss --region us-east-1 \
     -X GET \
     https://{collection-id}.us-east-1.aoss.amazonaws.com/my_logs/_search | jq -r '.'

Sample response:

{
  "took": 367,
  "timed_out": false,
  "_shards": {
    "total": 0,
    "successful": 0,
    "skipped": 0,
    "failed": 0
  },
  "hits": {
    "total": {
      "value": 1,
      "relation": "eq"
    },
    "max_score": 1,
    "hits": [
      {
        "_index": "my_logs",
        "_id": "1%3A0%3ALkidTIgBbiu_ytx_zXnH",
        "_score": 1,
        "_source": {
          "time": "2014-08-11T11:40:13+00:00",
          "remote_addr": "122.226.223.69",
          "status": "404",
          "request": "GET http://www.k2proxy.com//hello.html HTTP/1.1",
          "http_user_agent": "Mozilla/4.0 (compatible; WOW64; SLCC2;)",
          "@timestamp": "2023-05-24T07:16:29.708Z"
        }
      }
    ]
  }
}

References