
    Polymorphism in Java

    Polymorphism is one of the core pillars of Object-Oriented Programming, and it affords a great deal of flexibility. While inheritance allows encapsulated and abstracted data to be shared, polymorphism allows both the data and the attached methods to be redefined based on context.

    Java supports polymorphic behavior through two means: Method Overloading and Method Overriding.

    Demonstration

    Say that I have an ArchiveItem class that stores the basic details of a name and id. Here I'll demonstrate both overloading and overriding:

    package polymorphism;
    
    public class ArchiveItem {
        private String id;
        private String name;
        
        // Overloaded constructor with all instance variables supplied
        public ArchiveItem(String id, String name) {
            this.id = id;
            this.name = name;
        }
    
        // alternative constructor with only the name provided
        public ArchiveItem(String name) {
            this.name = name;
            this.id = IDPackage.generate();
        }
    
        // Override the base equals() method
        @Override
        public boolean equals(Object compared) {
            if (compared == this) return true;
            if (!(compared instanceof ArchiveItem)) return false;
    
            ArchiveItem comparedItem = (ArchiveItem) compared;
    
            return this.id.equals(comparedItem.id);
        }
    
        // Override the base toString() method
        @Override
        public String toString() {
            return String.format("%s: %s", this.id, this.name);
        }
    }

    Taking a look at my constructors, I'm overloading the ArchiveItem constructor by declaring it twice with a different number of arguments. When the class is instantiated, Java knows which of the two to run based on the arguments provided. Another way of putting it: the constructor is chosen by matching the call against each declared signature.
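    Overloading isn't limited to constructors; any method can be declared multiple times with different parameter lists. Here's a small hypothetical sketch (these rename methods aren't part of the class above, just an illustration of picking an overload by parameter type):

    // Runs when called with a String
    public void rename(String name) {
        this.name = name;
    }

    // Same method name, different parameter type: this version runs
    // when rename() is called with another ArchiveItem.
    public void rename(ArchiveItem source) {
        this.name = source.name;
    }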

    Looking further down, I've written an equals and a toString method. All Objects in Java come with these methods. Every class inherits from the base Java Object class, and on that class are implementations for equals and toString. In fact, toString is what's called anytime you print an object to the console.
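    Since equals is now overridden, two ArchiveItems with the same id compare as equal even though they're separate objects in memory. A quick sketch (the ids here are made up):

    ArchiveItem original = new ArchiveItem("abc-123", "Guitar");
    ArchiveItem duplicate = new ArchiveItem("abc-123", "Guitar");

    System.out.println(original.equals(duplicate)); // true: same id
    System.out.println(original == duplicate);      // false: different objects in memory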

    Without any adjustments, passing an object to System.out.println() would print something like this:

    ArchiveItem guitar = new ArchiveItem("Guitar");
    System.out.print(guitar);
    // "polymorphism.ArchiveItem@28d93b30"

    The default toString inherited from Object prints the class name to the left of the @ symbol and the object's hash code (in hexadecimal) to the right. Typically, we want something more descriptive representing our class instance.

    In the toString override above, I'm instead printing the provided id and name of the ArchiveItem.

    By adding the @Override annotation, I'm declaring that I intend to implement my own logic for the already inherited toString method. The @Override annotation isn't actually necessary, but it's recommended: it flags to the compiler to check that you're in fact overriding an existing method. Great for catching typos!
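    For example, if I misspelled the method name, the annotation turns a silent mistake into a compile error (a hypothetical sketch):

    @Override
    public String toStrin() { // compile error: method does not override or implement a method from a supertype
        return this.name;
    }

    Without the annotation, toStrin would just be a brand new method, and printing the object would quietly fall back to the default toString.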

    Putting It All Together

    ArchiveItem piano = new ArchiveItem("Piano");
    System.out.print(piano);
    // "Piano, 93nkf903f"

    Here it is in action! The ArchiveItem is instantiated with only one argument, so Java calls the matching single-argument constructor. One line down, my implementation of toString is called when I pass the piano object into the print method.

    Here is the same class, but instantiated through the other constructor signature:

    ArchiveItem piano = new ArchiveItem("custom-id", "Piano");
    System.out.print(piano);
    // "Piano, custom-id"

    Gwynn — Woodland Waltz

    Listen on Youtube

    A sweet dance from one of my Mom's old piano books 🦌

    Still Life

    💡

    Starting to do color studies!

    Following the excellent Ctrl Paint series.

    Deploying Docker Compose Application to AWS EC2

    Many deployment platforms (Vercel, Heroku, Render) add a great amount of magic and convenience to the process of publishing a web app. But is it all that difficult to work without some of the tooling?

    I wanted to find out. So this week I put on my DevOps hat and decided to get my hands dirty!

    My aim was to take an app I had built, wrap it up along with a database in Docker containers, and deploy to AWS. All without extra bells and whistles — no Fargate, no Elastic Beanstalk, no CI/CD integration just yet. Just a simple Linux server on EC2!

    In this case, it's a Java Spring Boot app with a PostgreSQL db. Though, since it's all wrapped up with Docker Compose, this post should apply to any containerized app.

    Here we go!

    Local Setup

    Write the Dockerfile

    Assuming I already have my app built, the first step is writing the Dockerfile for it. I'm going to store it under src/main/docker for organization. We'll also keep it pretty simple for the application:

    FROM openjdk:17-oracle
    COPY . /app
    ENTRYPOINT ["java", "-jar", "app/app.jar"]

    All that's happening here is I'm using the OpenJDK image matching the Java version I build with. Then I copy the directory contents into the container's /app folder. And lastly, I kick off the program with java -jar app/app.jar.

    Build the Executable

    If you're not running Spring Boot, feel free to skip ahead! Here's how I'm setting up my executable:

    To build my app, I'm going to run mvn clean package. This will generate a jar file in my target folder. From there, I'll simply move it over to the docker directory with the Linux command:

    cp target/demo-0.0.1-SNAPSHOT.jar src/main/docker/app.jar

    Write the Docker Compose Config

    Next is the docker compose file. This is where I'm bringing in the PostgreSQL db and wrapping it up with my app. Here's the file:

    services:
      app:
        container_name: spring-boot-postgresql
        image: 'docker-spring-boot-postgres:latest'
        build:
          context: .
          dockerfile: Dockerfile
        ports:
          - "80:80"
        depends_on:
          - db
        environment:
          - SPRING_DATASOURCE_URL=jdbc:postgresql://db:5432/compose-postgres
          - SPRING_DATASOURCE_USERNAME=compose-postgres
          - SPRING_DATASOURCE_PASSWORD=compose-postgres
          - SPRING_JPA_HIBERNATE_DDL_AUTO=update
        
      db:
        image: 'postgres:13.1-alpine'
        container_name: db
        environment:
          - POSTGRES_USER=compose-postgres
          - POSTGRES_PASSWORD=compose-postgres

    app and db are the individual services here in my Compose setup. For each, I'm pulling the relevant base images for Spring and PostgreSQL respectively. Under app.build, we're setting the context to be the current directory (src/main/docker) and pulling the Dockerfile from there.

    A few areas specific to my setup:

    • Spring Boot runs on port 8080 by default. In my application.properties configuration, I've set the port to 80. That's the default HTTP port, so on the EC2 server I'll be able to access the app at "myapp.com" rather than "myapp.com:8080". The "80:80" mapping above keeps the port the same inside and outside the container. (I'll show the property itself just after this list.)
    • I'm setting my environment variables on both services. The default port for PostgreSQL is 5432, so that's where the db url points.
    • Hibernate is an ORM mapping Java objects to SQL/relational databases. Here I'm specifying that Hibernate should update the SQL schema based on my application's model configuration.
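    For reference, here's roughly what that port setting looks like in my application.properties. The datasource values arrive through the compose environment variables, so only the port needs to be set here:

    # Serve on the default HTTP port instead of Spring Boot's 8080
    server.port=80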

    AWS Setup

    At this point, I'll point you to the AWS docs for setting up an EC2 instance. Here's the gist:

    • Ensure you have a VPC created. The default is fine if you have it.
    • Instantiate your EC2, configured to Linux.
    • Generate your key pair
    • Edit the security group to allow inbound HTTP requests

    Once your EC2 is up, it's time to SSH into it!

    SSH and Installs

    From your local machine, grab your key pair as well as the public DNS address. (You can find instructions on the instance page after clicking "Connect".)

    ssh -i /main-keypair.pem ec2-user@ec2-34-75-385-24.compute-1.amazonaws.com

    The most magical part to me: after that, you'll be logged in and accessing the Linux terminal on your server!!

    Since it's simply a Linux server, we can install all the dependencies we need just as if we were doing it on our own machine.

    From here, install the following (rough commands for these are sketched just after the list):

    • Docker
    • Docker Compose
    • Git
    • Maven (or whichever build tool you are using)
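    On Amazon Linux 2, that looks roughly like this. Treat it as a sketch: the exact package names, the Compose release you pin, and the Maven install will vary by distro and version.

    sudo yum update -y
    sudo yum install -y docker git
    sudo service docker start

    # Docker Compose is a separate binary; grab a release from GitHub
    sudo curl -L "https://github.com/docker/compose/releases/download/v2.24.0/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose
    sudo chmod +x /usr/local/bin/docker-compose

    # Maven (or whichever build tool you use; install it manually if your repos don't carry it)
    sudo yum install -y maven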

    After that, here's how we'll get the code onto our server:

    • Add the current user to the docker group: sudo usermod -aG docker $USER, then sudo reboot.
    • Clone your git repo to the server (prereq: upload your project to GitHub!): git clone ssh://john@example.com/path/to/my-project.git
    • Build the application on the server: mvn package
      • We'll have to move the jar file to the docker directory once again.
    • Navigate to the docker directory: cd src/main/docker
    • Build the docker image: docker-compose -f docker-compose.yml build
    • Run the container with docker-compose up, or docker-compose up -d to run in the background and keep it running after you exit the server.

    After that, accessing the public DNS address should show your app up and running!

    Automation

    Now the app is up! However, what if we need to make changes to the app? It's not a back-breaking process, but it would involve a few steps:

    • Push the changes
    • SSH back into the server
    • Pull the latest changes
    • Rebuild the executable
    • Rebuild the docker image
    • Rerun the docker container

    Something that is certainly doable in a few minutes. But it screams for automation, doesn't it?

    The next step for me would be to embrace that automation. Now that I know the individual steps for deploying an app to a Linux server, I'd look at tools such as GitHub Actions or CircleCI to automate much of this process.
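    For a taste of what that might look like, here's a rough sketch of a GitHub Actions workflow that SSHes into the instance and re-runs the manual steps on every push. The secret names, the project directory, and the jar path are placeholders from my setup, not anything official:

    name: Deploy to EC2

    on:
      push:
        branches: [main]

    jobs:
      deploy:
        runs-on: ubuntu-latest
        steps:
          - name: Redeploy over SSH
            run: |
              # Write the private key from a repository secret to a file
              echo "${{ secrets.EC2_SSH_KEY }}" > key.pem
              chmod 600 key.pem
              # Re-run the manual steps on the server: pull, rebuild, restart
              ssh -o StrictHostKeyChecking=no -i key.pem ec2-user@${{ secrets.EC2_HOST }} '
                cd my-project &&
                git pull &&
                mvn clean package &&
                cp target/demo-0.0.1-SNAPSHOT.jar src/main/docker/app.jar &&
                cd src/main/docker &&
                docker-compose up -d --build
              '

    With something like that in place, a push to main would redeploy the app without touching the server by hand.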

    Then, of course, there are many more considerations for a real world app. Performance monitoring, error logging, automatic scaling, load balancing — just to name a few!

    It was great to take a deep dive on deployment in isolation! On to exploring further tooling to support that process.

    White Coat

    Woohoo!

    Celebrated several things these past few weeks — including Miranda's white coat ceremony!!