Accessing PostgreSQL data from AWS Lambda

All databases need some maintenance that we usually perform using scheduled tasks. For example, if you have an RDS instance and you want to get a bloat report once a month, you’ll probably need a small EC2 instance just for these kinds of things. In this post, we will talk about accessing RDS, getting the result of a SQL statement, and reporting it to a Slack channel using a scheduled Lambda function; let’s call this the poor man’s crontab :)

Before we start, psycopg2 isn’t available in Lambda out of the box, so it needs to be packaged into the Lambda deployment package along with any other dependencies. And since psycopg2 requires libpq, it needs to be compiled with libpq statically linked. There are many prebuilt binaries of psycopg2 that you can download, but I would suggest you compile your own, using the latest PostgreSQL source code. We’ll go through all of these steps throughout this post.
We’ll cover the following:
  • Creating a new AWS PostgreSQL RDS instance
  • How to compile psycopg2 (we’ll use Docker for that)
  • Code for the function
  • Packaging
  • Lambda function, trigger, schedule
  • Testing

I won’t get into much detail about the steps of building an RDS instance as it is pretty straightforward. Assuming you have the AWS CLI set up, you’ll need something similar to the following; if not, go to the AWS console and clickety-click.

aws rds create-db-instance  \
--db-subnet-group-name [your subnet group] \
--db-security-groups [your security group] \
--db-instance-identifier [your instance identifier] \
--db-instance-class db.t2.micro \
--engine postgres \
--allocated-storage 5 \
--no-publicly-accessible \
--db-name [your db name] \
--master-username [your username] \
--master-user-password [your password] \
--backup-retention-period 3
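If you would rather do this from Python, the same call can be made with boto3; a sketch under the assumption that your credentials are already configured (all the values below are placeholders, mirroring the CLI flags above):

```python
# Sketch: the same create-db-instance call via boto3 instead of the CLI.
# Every value below is a placeholder -- fill in your own.
import os

PARAMS = {
    "DBInstanceIdentifier": "your-instance-identifier",
    "DBInstanceClass": "db.t2.micro",
    "Engine": "postgres",
    "AllocatedStorage": 5,
    "DBName": "yourdbname",
    "MasterUsername": "yourusername",
    "MasterUserPassword": "yourpassword",
    "DBSubnetGroupName": "your-subnet-group",
    "DBSecurityGroups": ["your-security-group"],
    "PubliclyAccessible": False,
    "BackupRetentionPeriod": 3,
}

if __name__ == "__main__" and os.environ.get("AWS_PROFILE"):
    import boto3  # only needed when actually creating the instance
    boto3.client("rds").create_db_instance(**PARAMS)
```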
For compiling psycopg2 we need a Linux installation. I am on a Mac (don’t hate), so I used a Docker container (Ubuntu Bionic) for the compiles. You can find the Dockerfile here.

The steps for compiling are the following:
# Download sources:
wget -c https://ftp.postgresql.org/pub/source/v11.4/postgresql-11.4.tar.gz
git clone https://github.com/psycopg/psycopg2.git
# Untar
tar zxfv postgresql-11.4.tar.gz
# Make a local install dir
mkdir ./pgsql-11.4
# Configure, compile, install postgres
cd postgresql-11.4 ; ./configure --prefix=/home/postgres/pgsql-11.4 --without-readline --without-zlib
make ; make install
# Edit psycopg2 setup config
cd psycopg2 ; vim setup.cfg
# Change:
#   pg_config = /home/postgres/pgsql-11.4/bin/pg_config
#   static_libpq = 1
# Compile psycopg2
python3.6 setup.py build
# Package psycopg2
mv ~/psycopg2 ~/psycopg2_source
cp -r ~/psycopg2_source/build/lib.linux-x86_64-3.6/psycopg2/ ~/
tar cvfz psycopg2.tgz psycopg2/
# Finally, get the package out of docker; I just used scp
scp psycopg2.tgz vasilis@
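To confirm the static link actually worked, you can run ldd against the compiled shared object and check that no libpq.so dependency remains. A small sketch (the .so filename below is a placeholder; adjust it to whatever setup.py produced):

```python
# Sketch: sanity-check the compiled module. If ldd still lists libpq.so,
# libpq was NOT statically linked and the module will break on Lambda.
import os
import subprocess

def uses_shared_libpq(ldd_output):
    """Return True if the ldd output references libpq as a shared dependency."""
    return any("libpq.so" in line for line in ldd_output.splitlines())

if __name__ == "__main__":
    so_path = "psycopg2/_psycopg.cpython-36m-x86_64-linux-gnu.so"  # placeholder path
    if os.path.exists(so_path):
        out = subprocess.run(
            ["ldd", so_path], stdout=subprocess.PIPE, universal_newlines=True
        ).stdout
        print("dynamic libpq!" if uses_shared_libpq(out) else "statically linked, good")
```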

Next we need to write the Lambda function.
You can find my sample code, along with a binary version of psycopg2 for 11.4, and a deployment script in my repo here

Keep in mind that the filename and the function name (lambda_handler) are important, as they will be referenced later when we create the function in Lambda, and that the handler has to have this syntax structure:
def handler_name(event, context): 
    return some_value  
For more information, see the AWS documentation here. The code is pretty simple: it connects to an RDS instance, runs a simple statement, and reports the result to Slack. But it could be something different, like AWS SNS, or your trigger event might be something else entirely, for example: when a file is created in S3, do something in Python and load it into Postgres. Lambda also supports API Gateway. For more information about the different things you can do, see here.
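The actual code lives in the repo, but its core looks roughly like the sketch below. The environment variable names, the Slack channel, and the dead-tuple query are my assumptions for illustration, not the repo’s exact code; Slacker’s chat.post_message is its real API:

```python
# Sketch of the handler: connect to RDS, run a statement, post the
# result to Slack. Connection details come from environment variables
# (placeholders -- set your own in the Lambda configuration).
import os

QUERY = """
SELECT relname, n_dead_tup
FROM pg_stat_user_tables
ORDER BY n_dead_tup DESC
LIMIT 5;
"""

def format_report(rows):
    """Render (relname, n_dead_tup) rows as a plain-text Slack message."""
    lines = ["{}: {}".format(rel, dead) for rel, dead in rows]
    return "Dead tuples per table:\n" + "\n".join(lines)

def lambda_handler(event, context):
    # Imported here so the module can be read without the packaged deps.
    import psycopg2
    from slacker import Slacker

    conn = psycopg2.connect(
        host=os.environ["DB_HOST"],
        dbname=os.environ["DB_NAME"],
        user=os.environ["DB_USER"],
        password=os.environ["DB_PASS"],
    )
    with conn.cursor() as cur:
        cur.execute(QUERY)
        rows = cur.fetchall()
    conn.close()

    Slacker(os.environ["SLACK_TOKEN"]).chat.post_message(
        "#db-reports", format_report(rows)
    )
    return "done"
```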

Next, we need to package all of this into a “lambda deployment package”; the AWS documentation can be found here. This is basically a zip that contains the code we want Lambda to execute, along with all its dependencies: in my case, Slacker (and its dependencies) and the psycopg2 package we compiled earlier. Amazon’s documentation is pretty clear on how to do this, so I won’t repeat the steps here, but your zip file should look like this when unzipped:

> tree -L 1
├── bin
├── certifi
├── certifi-2019.6.16.dist-info
├── chardet
├── chardet-3.0.4.dist-info
├── idna
├── idna-2.8.dist-info
├── psycopg2
├── requests
├── requests-2.22.0.dist-info
├── slacker
├── slacker-0.13.0.dist-info
├── urllib3
├── urllib3-1.25.3.dist-info
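If you prefer to script the packaging from Python rather than the shell, a minimal sketch (the directory and zip names are placeholders); the important detail is that paths inside the archive must be relative to the package root, or Lambda will not find the handler module:

```python
# Sketch: zip a directory tree into a Lambda deployment package.
import os
import zipfile

def build_package(src_dir, out_zip):
    with zipfile.ZipFile(out_zip, "w", zipfile.ZIP_DEFLATED) as zf:
        for root, _dirs, files in os.walk(src_dir):
            for name in files:
                path = os.path.join(root, name)
                # Store relative to src_dir so psycopg2/, slacker/ etc.
                # sit at the top level of the zip.
                zf.write(path, os.path.relpath(path, src_dir))

if __name__ == "__main__" and os.path.isdir("packages"):
    build_package("packages", "deployment.zip")  # placeholder names
```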
Now that we have the deployment package ready, all we need to do is create the Lambda function and something that will trigger it on a schedule. Go to the AWS console, hit Lambda, and create a new function:

On the next screen, on the left side where the triggers are, pick CloudWatch Events; we’ll use these events to schedule how often the function runs. We should also set the runtime to Python 3.6, and in Handler make sure that it matches the filename.handler_function_name of our code.

On the next screen you’ll have to create a new rule and write a schedule expression:

This schedule expression accepts a ‘rate’ expression (e.g. rate(1 day)) or a cron expression; see the AWS documentation here.
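The rule can also be created without the console; a sketch using boto3 (the rule name and the function ARN are placeholders, while put_rule and put_targets are the real CloudWatch Events API calls):

```python
# Sketch: create the CloudWatch Events schedule with boto3 instead of
# the console. The rule name and ARN below are placeholders.
import os

def monthly_cron(hour):
    """Cron expression for 'the 1st of every month at <hour>:00 UTC'."""
    return "cron(0 {} 1 * ? *)".format(hour)

if __name__ == "__main__" and os.environ.get("AWS_PROFILE"):
    import boto3  # only needed when actually creating the rule
    events = boto3.client("events")
    events.put_rule(Name="pg-lambda-monthly", ScheduleExpression=monthly_cron(8))
    events.put_targets(
        Rule="pg-lambda-monthly",
        Targets=[{
            "Id": "pg_lambda",
            "Arn": "arn:aws:lambda:[region]:[account]:function:pg_lambda",
        }],
    )
```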
From the function’s main page in the AWS console, scroll down to Network and add your VPC. Last, at ‘Execution role’, add a role or create a new one like the following:

As you may have noticed, I haven't added any code yet. You can add your zip from the console:

I found it much easier to use a script like the following to make testing a bit less tedious. Keep in mind that it requires the AWS CLI to be installed:

cp ./[your code].py ./packaging/
rm -f [your zip].zip
cd packages ; zip -r9 /Users/vasilis/Work/working_on/projects/lambda/packaging/[your zip].zip .
cd $HOME
zip -g [your zip].zip [your code].py
zip -g [your zip].zip [another file]
zip -g [your zip].zip [another file]
aws lambda update-function-code --function-name pg_lambda --zip-file fileb://[your zip].zip
sleep 2
rm -f outfile ; aws lambda invoke --function-name "pg_lambda" outfile ; cat outfile
In this script, I rebuild the deployment package, update the function code on AWS, and finally invoke the function manually, which should give me a result in Slack. I found this much easier to work with than making inline changes in the AWS console.
Lambda is cool.. kinda. Having a way to schedule simple Postgres tasks without a server or an EC2 instance is nice, but this is by no means a replacement for crontab, simply because Lambda has a lot of limitations. Say you wanted a pgbadger report sent every day; you are out of luck, as Perl is not supported in Lambda. But if you want a bloat report, or any SQL statement executed on a schedule, Lambda could be a solution.

As mentioned before, the Dockerfile, scripts, and a binary of psycopg2 (built against PostgreSQL 11.4) can be found here.

Thanks for reading
Vasilis Ventirozos

