About Me

Hi! My name is Ram Vegiraju, and I am currently a Solutions Architect at Amazon. I am passionate about understanding the theory behind Machine/Deep Learning algorithms and implementing full-stack Machine Learning solutions at scale for interesting problems in fields such as NLP and Computer Vision. In my free time I love writing, reading, and playing basketball or tennis. Feel free to check out my Resume/Skills below and some of my projects and hobbies in the other sections!

Education

University of Virginia

B.A. Degree in Statistics, Minor in Computer Science Sept 2017 - Dec 2020

Relevant Coursework: Machine Learning, Linear Algebra, Software Development Methods (Java), Data Analysis with Python, Data Science with R, Mathematical Statistics & Probability, Regression Analysis

Work

Amazon

R&D Software Engineering Intern June 2020 - August 2020

Built and presented prototypes over 12 weeks as an R&D Engineer for two different customers.

  • Objective: Developed and deployed a full-stack Neural Machine Translation prototype for a client. Presented weekly demos to the client and gained extensive experience with various AWS services: Lambda, S3, API Gateway, CloudFormation, Cognito, SageMaker, ECR, Amplify, and KMS.
  • React/JavaScript: Utilized AWS Amplify to create a front-end dashboard with authentication through AWS Cognito. The application processes client inputs and accesses a REST API created through AWS API Gateway.
  • Python: Developed AWS Lambda functions that query and store inputs in AWS S3. Utilized the AWS Python SDK (boto3) to connect to AWS Translate and an AWS SageMaker endpoint and return outputs to the front-end while storing results in S3.
  • NLP/ML: Worked with the NLTK, spaCy, and Transformers libraries to preprocess and tokenize languages prior to modeling. Developed, trained, and tuned a seq2seq model for translation. Deployed the model using Docker with ECR, and accessed the endpoint for inference through Lambda.
  • CloudFormation: Helped develop Bash scripts and a CloudFormation template to automate deployment and creation of architecture for the client.
  • Second Customer/Project: Helped build an application using high-level ML/NLP services. Developed Python AWS Lambda functions to work with AWS Comprehend for Sentiment Analysis, Entity Extraction, and Topic Modeling.
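The Lambda pattern described above (accept a front-end request, call AWS Translate via boto3, persist the result to S3, return JSON) can be sketched roughly as follows. This is a minimal illustration, not the original code: the event shape, bucket name, key layout, and the client-injection style (used here so the flow can be exercised without AWS credentials) are all assumptions.

```python
import json


def translate_handler(event, translate, s3, bucket="example-bucket"):
    """Illustrative Lambda-style handler: translate input text, store the
    result in S3, and return it to the caller as JSON.

    `translate` and `s3` are boto3-style clients passed in for testability;
    in a real Lambda they would be boto3.client("translate") and
    boto3.client("s3") created at module load.
    """
    text = event["text"]

    # Call AWS Translate (boto3's translate_text API) with source/target
    # language codes; the defaults here are illustrative.
    resp = translate.translate_text(
        Text=text,
        SourceLanguageCode=event.get("source", "en"),
        TargetLanguageCode=event.get("target", "es"),
    )
    translated = resp["TranslatedText"]

    # Persist the result to S3 so the front-end dashboard can read it back.
    s3.put_object(
        Bucket=bucket,
        Key=f"results/{text[:32]}.json",
        Body=json.dumps({"input": text, "output": translated}),
    )

    # API Gateway-style response body.
    return {"statusCode": 200, "body": json.dumps({"translation": translated})}
```

In production the two clients would simply be `boto3.client("translate")` and `boto3.client("s3")`; injecting them as parameters is one common way to keep the handler unit-testable with stubs.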
American Society of Clinical Oncology (ASCO)

Data Science Intern June 2019 - August 2019

Worked on the backend for the Targeted Agent and Profiling Utilization Registry (TAPUR) team.

  • Objective: Transferred patient and form data from over 110 registered sites from Syapse to a new platform, Rave EDC.
  • SAS/SQL: Utilized the SAS Macro Language and PROC SQL queries to develop a program that parses Syapse patient and drug data to create forms, ensuring successful migration into the new data platform.
  • ML: Developed logistic regression and discriminant analysis models to evaluate factors affecting re-enrollment of patients in TAPUR cohorts.
  • Protocol Team: Communicated with the Data and Protocol Teams in agile meetings to understand the workflow of patient and drug forms in the healthcare industry, and earned a CITI certification.
Skills

I have experience implementing various Supervised and Unsupervised models such as MLR, Logistic Regression, k-Means Clustering, PCA, CART/Random Forest, and SVM. In addition, I am well versed in deep learning architectures such as ANNs, CNNs, and RNN/LSTMs, as well as their applications in fields such as NLP and Time Series analysis. I am currently learning more about GANs and Reinforcement Learning.