Looking for a WordPress and AWS Expert for Consulting

Keywords: WordPress + NGINX + SSL - AWS - How to - Other

Description:
We need to set up autoscaling, AWS alarms for monitoring, and determine proper sizing. We are looking for a consultant to work with us, review our setup, and train us on how to maintain this new WP system on AWS. Must have the AWS side of the knowledge.

Hi @JHWebb,

Thank you for your interest. We do not provide that kind of support. However, we do provide a WordPress solution that includes autoscaling features.

https://aws.amazon.com/marketplace/pp/prodview-d3wxyyhusm6aq
https://docs.bitnami.com/aws-templates/apps/wordpress-production-ready/

You can review that solution and verify if it meets your requirements.

Thanks

Hi,
If you are still looking for help, I would be glad to help you out, as what you need done is well within my area of expertise.
You can reach me at andrewjohnson.56782@gmail.com
Cheers and have a great day ahead,
Andrew

@JHWebb Please go through this link to understand how to autoscale your instances:

https://docs.aws.amazon.com/codedeploy/latest/userguide/tutorials-auto-scaling-group-create-auto-scaling-group.html

And for setting up alarms, go through this:

https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/using-cloudwatch-createalarm.html

I hope you find it useful. If you need further help, you can reply back, or you can use AWS developer support.
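For the alarm piece, a minimal boto3 sketch along these lines can create a CPU alarm on an Auto Scaling group; the group name, threshold, and SNS topic ARN below are placeholders for illustration, not values from your setup:

import boto3

cloudwatch = boto3.client('cloudwatch', region_name='us-west-1')

# Alarm when average CPU across the Auto Scaling group stays above 70% for two 5-minute periods.
# 'wordpress-asg' and the SNS topic ARN are placeholder names.
cloudwatch.put_metric_alarm(
    AlarmName='wordpress-high-cpu',
    Namespace='AWS/EC2',
    MetricName='CPUUtilization',
    Dimensions=[{'Name': 'AutoScalingGroupName', 'Value': 'wordpress-asg'}],
    Statistic='Average',
    Period=300,
    EvaluationPeriods=2,
    Threshold=70.0,
    ComparisonOperator='GreaterThanThreshold',
    AlarmActions=['arn:aws:sns:us-west-1:123456789012:wordpress-alerts'],
)

The same alarm can also drive scaling directly if you pass a scaling policy ARN in AlarmActions instead of an SNS topic.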

@JHWebb For proper sizing, you need to determine your audience, site traffic, and scalability needs for future growth.
You can get back to me with your requirements and I will definitely help as much as I can.

Hi. Thanks for the response. I agree with your statement. However, we need to scale this system and want to make sure we are not missing anything. Additionally, we are adding functional capabilities and need to integrate Elasticsearch with S3 to enable document management. This is why we are looking for a consultant to help with the growth and scale of our WP setup in AWS. Must have both WP and AWS experience.

Hello @JHWebb

Is this position still open?

Yup @seth2, you should provide us your current audience size, your targeted audience size, the number of product units you would like to display, whether you want a CDN or not, and the type and size of the enterprise, so we can assist you further.

Further, you can use the steps below for integrating Elasticsearch with S3 using AWS Lambda.

Using AWS Lambda to Connect S3 to Elasticsearch

To load data from S3 to Elasticsearch, you can use AWS Lambda to create a trigger that loads the data continuously from S3 to Elasticsearch. The Lambda function watches the S3 location for new files, and on each event it triggers the code that indexes your file.

The process of loading data from Amazon S3 to Elasticsearch with AWS Lambda is very straightforward. The following steps are required to connect S3 to Elasticsearch using this method:

Step 1: Create a Lambda Deployment Package

  • Open your favorite Python editor and create a package called s3ToES.
  • Create a Python file named “s3ToES.py” and add the following lines of code. Edit the region and host in the code below.

Import Libraries:

import boto3
import re
import requests
from requests_aws4auth import AWS4Auth

Define Constants:

region = 'us-west-1'
service = 'es'
creds = boto3.Session().get_credentials()
awsauth = AWS4Auth(creds.access_key, creds.secret_key, region, service, session_token=creds.token)
host = 'http://aws.xxxxxxxxxxx.com/es'  # Elasticsearch domain endpoint (placeholder)
index = 'lambda-s3-file-index'
doc_type = 'lambda-type'
url = host + "/" + index + "/" + doc_type
headers = { "Content-Type": "application/json" }
s3 = boto3.client('s3')
  • Create Regular Expressions to parse the logs.
pattern_ip = re.compile(r'(\d+\.\d+\.\d+\.\d+)')
pattern_time = re.compile(r'\[(\d+/\w\w\w/\d\d\d\d:\d\d:\d\d:\d\d\s-\d\d\d\d)\]')
pattern_msg = re.compile(r'"(.+)"')
  • Define Lambda handler Function.
def parseLog(event, context):
    for record in event["Records"]:
        # From the event record, get the bucket name and key.
        bucket_name = record['s3']['bucket']['name']
        file_key = record['s3']['object']['key']
        # From the S3 object, read the file and split the lines.
        obj = s3.get_object(Bucket=bucket_name, Key=file_key)
        body = obj['Body'].read().decode('utf-8')
        lines = body.splitlines()
        # For each line, extract the IP, timestamp and message, then index the document.
        for line in lines:
            ip = pattern_ip.search(line).group(1)
            timestamp = pattern_time.search(line).group(1)
            message = pattern_msg.search(line).group(1)
            parsed_doc = { "ip": ip, "timestamp": timestamp, "message": message }
            r = requests.post(url, auth=awsauth, json=parsed_doc, headers=headers)
  • Install the Python packages to the folder where the code resides.

Windows :

cd s3ToES
pip install requests -t .
pip install requests_aws4auth -t .

Linux:

cd s3ToES
pip install requests -t .
pip install requests_aws4auth -t .
  • Package the code and the dependencies:

Windows:

Right-click on the s3ToES folder and create a zip package

Linux:

zip -r lambda.zip *

Step 2: Create the Lambda Function

Once you have created the deployment package, you need to create a Lambda function to deploy it.

  • Search for AWS Lambda in the AWS Console, and then click Create Function.

  • Once you create the function, you need to add a trigger that will invoke it when the event happens. In this example, we want the code to run whenever a log file arrives in the S3 bucket. Follow the steps below to create the trigger.
  1. Choose S3.
  2. Choose your bucket.
  3. For Event Type, choose PUT.
  4. For Prefix, type logs/.
  5. For Suffix, type .log.
  6. Select Enable trigger.
  7. Choose Add.
  • For Handler, type s3ToES.parseLog. This setting tells Lambda that the file is s3ToES.py and the method to invoke on the trigger is parseLog.

  • Select the zip file as the code entry type and upload the zip file created above.

  • Choose Save.

Once you save, the Lambda function is ready to run.
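If you prefer to script the trigger rather than clicking through the console, a rough boto3 equivalent of the steps above is sketched below; the bucket name, account ID, and function ARN are placeholders you would replace with your own:

import boto3

lambda_client = boto3.client('lambda')
s3 = boto3.client('s3')

# Allow S3 to invoke the function (bucket name and account ID are placeholders).
lambda_client.add_permission(
    FunctionName='s3ToES',
    StatementId='s3-invoke-s3ToES',
    Action='lambda:InvokeFunction',
    Principal='s3.amazonaws.com',
    SourceArn='arn:aws:s3:::my-log-bucket',
)

# Invoke the function on object PUTs under logs/ that end in .log, matching the console steps above.
s3.put_bucket_notification_configuration(
    Bucket='my-log-bucket',
    NotificationConfiguration={
        'LambdaFunctionConfigurations': [{
            'LambdaFunctionArn': 'arn:aws:lambda:us-west-1:123456789012:function:s3ToES',
            'Events': ['s3:ObjectCreated:Put'],
            'Filter': {'Key': {'FilterRules': [
                {'Name': 'prefix', 'Value': 'logs/'},
                {'Name': 'suffix', 'Value': '.log'},
            ]}},
        }]
    },
)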

Step 3: Test the Lambda Function

  • To test the Lambda function, you need to upload a file to the S3 location.

  • Create a file named sample.log and add the following log entries:

12.345.678.90 - [23/Aug/2020:13:55:36 -0700] "PUT /some-file.jpg"
12.345.678.91 - [23/Aug/2020:14:56:14 -0700] "GET /some-file.jpg"
  • Once you upload the file, the upload event will invoke the Lambda function, which will index the log lines in Elasticsearch.

  • Go to the Elasticsearch console or Kibana and verify that the ‘lambda-s3-file-index’ index contains two documents.

GET https://es-domain/lambda-s3-file-index/_search?pretty
{
  "hits" : {
    "total" : 2,
    "max_score" : 1.0,
    "hits" : [
      {
        "_index" : "lambda-s3-file-index",
        "_type" : "lambda-type",
        "_id" : "vTYXaOWEKWV_TTkweuSDg",
        "_score" : 1.0,
        "_source" : {
          "ip" : "12.345.678.91",
          "message" : "GET /some-file.jpg",
          "timestamp" : "23/Aug/2020:14:56:14 -0700"
        }
      },
      {
        "_index" : "lambda-s3-index",
        "_type" : "lambda-type",
        "_id" : "vjYmaWIBJfd_TTkEuCAB",
        "_score" : 1.0,
        "_source" : {
          "ip" : "12.345.678.90",
          "message" : "PUT /some-file.jpg",
          "timestamp" : "23/Aug/2020:13:55:36 -0700"
        }
      }
    ]
  }
}
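As a side note, you can also run the same test from Python instead of the console; a small sketch, assuming 'my-log-bucket' is a placeholder for the bucket configured in the trigger:

import boto3

s3 = boto3.client('s3')

# Upload the sample log under the logs/ prefix so the S3 trigger fires
# ('my-log-bucket' is a placeholder bucket name).
s3.upload_file('sample.log', 'my-log-bucket', 'logs/sample.log')

After the upload, the two documents should show up in the 'lambda-s3-file-index' index as in the search output above.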

Yes, please reach out and we can share info privately.

Okay.
Please share your contact details.

jesse period(.) webb @ avalonhcs.com