Codename: Applesauce Week One (aka Tweeting w/ Lambda)

By | 23 December, 2016

First week of progress. I started small and have continuously added and improved. At this point I have a Lambda function in Python that takes an action and a project as input, then appends the time (hard-coded to EST). I think I need to simplify back to the original idea of just recording start and stop times, though the extra input is good for testing (see here).
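For reference, the event the function currently expects looks something like this (the field names come from the test payload I use later on):

```json
{
  "Tweet": "Finished",
  "Project": "CodenameApplesauce"
}
```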

Without further ado, here’s the progress made so far.

I started out with this simple tutorial on getting set up to tweet via a Python script. In a relatively short period of time I was able to send my first tweet through a Python script. Good first step! The next step was to have it run as a Lambda function. I initially found this repository, which looked promising. I really liked the idea of a KMS key and the ability to keep the credentials encrypted, so I started down the route of integrating KMS. I created a new key, and found some good info here.

I created a policy for an IAM user and tested it out.

The policy is as follows (the action list and key ARN below are representative placeholders; swap in your own):

{
	"Version": "2012-10-17",
	"Statement": [{
		"Sid": "Stmt1481667200000",
		"Effect": "Allow",
		"Action": [
			"kms:Encrypt",
			"kms:Decrypt"
		],
		"Resource": [
			"arn:aws:kms:us-east-1:ACCOUNT_ID:key/KEY_ID"
		]
	}]
}

Then I tested with:

aws kms encrypt --key-id &lt;key-id&gt; --plaintext fileb://lulzy.json --output text --query CiphertextBlob --profile applesauce --region us-east-1 | base64 --decode > YouCantReadMe

That all worked as expected… Let's try it all in a Lambda now! First I had to get all the dependencies sorted. Check here:

Not coming from a Python background, but wanting to learn more, I was getting stuck with the zip file creation. Having a single script run was fine, but certain imports, dependencies, and config files would require a zip. This site helped out with a basic bash script to create that.

I modified both the bash script and the Lambda function to do my bidding. The Twitter credentials are in clear text, which is a no-no but fine for testing; the next step is to get them encrypted using KMS. For now, though, here is what I have on the bash script side:


# where the zip archive gets built (archive and handler names assumed here)
DIR=$(pwd)
ZIP=$DIR/chirps_mcgee.zip
# blindly clean up from previous executions if they exist
rm -rf chirps_mcgee
rm -f $ZIP
# create a virtual environment named "chirps_mcgee"
virtualenv chirps_mcgee
# activate the virtual environment
source chirps_mcgee/bin/activate
# install/upgrade pip
pip install --upgrade pip
pip install --upgrade tweepy pytz
# add our python lambda handler code to the lambda zip archive
zip -9 $ZIP lambda_function.py
zip -r9 $ZIP secrets
# add all the contents of site-packages to the zip archive
cd $VIRTUAL_ENV/lib/python2.7/site-packages
zip -r9 $ZIP *
cd $VIRTUAL_ENV/lib64/python2.7/site-packages
zip -r9 $ZIP *

# file: lambda_function.py

import tweepy
from datetime import datetime
from pytz import timezone
import boto3
from base64 import b64decode

#used for local testing
#session = boto3.Session(profile_name='applesauce',region_name='us-east-1')

session = boto3.Session(region_name='us-east-1')
kms = session.client('kms')

def get_api(cfg):
    auth = tweepy.OAuthHandler(cfg['consumer_key'], cfg['consumer_secret'])
    auth.set_access_token(cfg['access_token'], cfg['access_token_secret'])
    return tweepy.API(auth)

def decrypt_data(encrypted_data):
    decrypted = kms.decrypt(CiphertextBlob=encrypted_data)
    return decrypted['Plaintext']

def lambda_handler(event, context):

    # Fill in the values noted in previous step here
    with open('secrets') as f:
        content = f.readlines()

    cfg = {
        "consumer_key": decrypt_data(b64decode(content[0].strip("\n"))),
        "consumer_secret": decrypt_data(b64decode(content[1].strip("\n"))),
        "access_token": decrypt_data(b64decode(content[2].strip("\n"))),
        "access_token_secret": decrypt_data(b64decode(content[3].strip("\n")))
    }

    api = get_api(cfg)

    detroit = timezone('America/Detroit')
    det_time = datetime.now(detroit)

    tweet = event['Tweet'] + ' #' + event['Project'] + ' @ ' + det_time.strftime('%H:%M')
    print(tweet)
    status = api.update_status(status=tweet)
    # Yes, the tweet is called 'status' -- rather confusing

if __name__ == "__main__":
    # local smoke test with the same payload used from the CLI
    lambda_handler({'Tweet': 'Finished', 'Project': 'CodenameApplesauce'}, None)

This was all working well, but I then wanted to update directly from the terminal rather than using the management console every time, so I added the AWSLambdaFullAccess policy to my project IAM user and executed the following:

aws lambda update-function-code --zip-file fileb:///Users/kmcauliffe/codename_applesauce/new_attempy/ --function-name lambda_function --region us-east-2 --profile applesauce

Last step for tonight is to invoke via aws cli. This was done with:

aws lambda invoke --invocation-type RequestResponse --function-name lambda_function --region us-east-2 --log-type Tail --payload '{"Tweet": "Finished", "Project": "CodenameApplesauce"}' --profile applesauce output.txt
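Since the invoke uses --log-type Tail, the response also carries the execution log base64-encoded in a LogResult field. A quick sketch of decoding it (the response dict here is built in place rather than read from a real invoke):

```python
from base64 import b64encode, b64decode

# stand-in for the JSON response that `aws lambda invoke` returns;
# the log text is a made-up example, not real Lambda output
log_bytes = "START RequestId: abc123".encode("utf-8")
response = {"StatusCode": 200,
            "LogResult": b64encode(log_bytes).decode("ascii")}

# LogResult is base64 -- decode it to get the readable execution log
log_text = b64decode(response["LogResult"]).decode("utf-8")
print(log_text)
```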

The next thing to do to finish out the week was KMS encryption. I did that using another Lambda function:

# file:

import boto3
from base64 import b64encode

session = boto3.Session(region_name='us-east-1')
kms = session.client('kms')
boto_master_key_id = 'cc0a7795-7391-4f3a-9919-ae8d322e38cb'

def encrypt_data(key, data):
  encrypted = kms.encrypt(KeyId=key, Plaintext=data)
  encrypted_data = b64encode(encrypted['CiphertextBlob'])
  print('Encrypted Encoded Text: ', encrypted_data)
  return encrypted_data

def lambda_handler(event, context):
  encrypt_data(boto_master_key_id, event['key_to_encrypt'])

if __name__ == "__main__":
  # local smoke test ('example-secret' is just a throwaway value)
  lambda_handler({'key_to_encrypt': 'example-secret'}, None)

The biggest issue I ran into was forgetting to base64 encode the encrypted credentials, which made Python cry. I added the encoding and everything worked as expected. For now I am just storing these in a file that gets read by the Lambda function. I'll have to do something better with this in the future.
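Concretely, the gotcha is that kms.decrypt wants the raw CiphertextBlob bytes, but a text file can only hold an encoded form, so each credential has to be base64-encoded when stored and b64decode'd when read back. A minimal stand-in sketch of that round trip (using fake ciphertext bytes in place of real KMS output):

```python
from base64 import b64encode, b64decode

# stand-in for encrypted['CiphertextBlob'] -- real KMS output is binary
fake_ciphertext = b"\x01\x02binary-blob\xff"

# write side: encode to base64 text, one credential per line in `secrets`
stored_line = b64encode(fake_ciphertext).decode("ascii") + "\n"

# read side: strip the newline and decode back to the raw bytes KMS expects
recovered = b64decode(stored_line.strip("\n"))
assert recovered == fake_ciphertext
```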

All in all, a successful first week. Forcing myself to use Python over bash has been fulfilling. As for where to go next… I would like to start storing data in DynamoDB, but will hold off on that until I am done with the course I am currently taking on A Cloud Guru.

That being the case, I’ll likely try to create an Alexa skill so that I can trigger this function by talking to my Alexas. More to come on that over the next week!
