IrvineHacks 2026 gathered hundreds of student hackers at the UCI Student Center from February 27–March 1, 2026 for an engaging weekend filled with creativity, collaboration and rapid prototyping. As a proud sponsor, Melissa for Education participated on Day 1 to inspire attendees to explore the potential of combining innovative ideas with real-world data.
Our team, consisting of Chief Data Officer Daniel K. Le, Software Engineer Team Lead Ha Phan, Software Engineer II William Huynh, Software Engineer Jessica Susilo, and Operations Assistant Sihan Wu, tabled on opening day with a simple mission:
- Introduce hackers to Melissa for Education
- Share how our free data portal supports academic projects and research
- Make sure everyone walked away with something fun and useful to remember the experience
Throughout the evening, the team met students brainstorming project ideas across AI, real estate, climate risk, mapping and analytics. They discussed how access to clean, production-grade data can transform a hackathon project from a mere concept into something truly demo-ready.

Swag, Conversations and Data at the Melissa Table
Day 1 was all about connecting in person. At the Melissa Education table, attendees stopped by to:
- Pick up branded tote bags, notebooks and sticker sheets
- Learn how to get started with the Melissa Edu data portal for free
- Ask questions about incorporating address, geo, consumer and property data into their projects
The swag was a hit, but the real highlight was the conversations: hearing students talk through their technical challenges, brainstorming ways to enrich their ideas with better data, and helping them understand how real organizations rely on data quality and verification every day.

A Flexible Approach: Open Access to APIs
This year, Melissa Education did not host a formal data challenge track. Instead, we focused on giving hackers maximum flexibility to use our tools in whatever way best fit their ideas.
Throughout the datathon, teams were invited to request access to Melissa APIs by posting in the #sponsor-melissa channel on the event Slack. From there, they could:
- Ask for help integrating address verification and geocoding
- Receive guidance on how to use data quality and enrichment APIs in their stack
This open, request-based model allowed teams to pull Melissa into their workflow organically, when and where it made the most sense for their project.

Spotlight: ClimateCheck – Climate Risk Scoring for Properties
One standout team that took advantage of Melissa APIs created ClimateCheck, a climate risk scoring tool for residential properties submitted to the IrvineHacks 2026 DevPost showcase.

What ClimateCheck Built
ClimateCheck helps prospective homeowners understand how vulnerable a given property might be to wildfires, floods and landslides. By entering an address, users receive a detailed risk report that includes:
- Normalized risk scores (0–100) for multiple climate hazards
- The distance to the nearest wildfire
- Risk probability over time
- Practical recommendations for how to prepare for or mitigate potential climate risks
It’s the kind of information that can help buyers make more informed decisions about where they live and how they plan for the future.
How They Used Melissa APIs
To power their analysis, the team used Melissa’s Global Address Verification API as a foundation for their data pipeline:
- Users enter a property address into the ClimateCheck interface.
- Melissa’s Global Address Verification API parses and verifies the address, returning latitude and longitude coordinates.
- Those coordinates are then fed into a set of government and commercial climate-related APIs, including FEMA flood data, USFS wildfire hazard data, USGS elevation and landslide inventories, NIFC wildfire perimeters, and NOAA weather alerts.
- The team aggregates these sources into a custom climate risk scoring formula, generating a unified 0–100 score for each hazard and passing the result to a Google Gemini model to produce explanations and recommendations in plain English.
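The verify-then-score step above can be sketched roughly as follows. This is a minimal illustration, not ClimateCheck's actual code: the field names, hazard ranges and weights here are assumptions made for the example.

```python
# Illustrative sketch of normalizing raw hazard readings into unified
# 0-100 risk scores, one per hazard. Field names and ranges are assumed.

def normalize(value: float, lo: float, hi: float) -> float:
    """Clamp a raw hazard reading into a 0-100 score."""
    if hi <= lo:
        raise ValueError("hi must exceed lo")
    return max(0.0, min(100.0, (value - lo) / (hi - lo) * 100.0))

def climate_risk_scores(raw: dict) -> dict:
    """Turn raw per-source readings into a unified 0-100 score per hazard."""
    return {
        # e.g. FEMA flood depth in feet (assumed meaningful range 0-10 ft)
        "flood": normalize(raw["flood_depth_ft"], 0, 10),
        # e.g. USFS wildfire hazard potential index (assumed range 0-5)
        "wildfire": normalize(raw["wildfire_hazard_index"], 0, 5),
        # e.g. USGS landslide susceptibility (assumed range 0-1)
        "landslide": normalize(raw["landslide_susceptibility"], 0, 1),
    }

if __name__ == "__main__":
    scores = climate_risk_scores({
        "flood_depth_ft": 2.5,
        "wildfire_hazard_index": 4,
        "landslide_susceptibility": 0.5,
    })
    print(scores)  # {'flood': 25.0, 'wildfire': 80.0, 'landslide': 50.0}
```

The appeal of a normalized scale like this is that readings from very different sources (flood depth, a hazard index, a probability) become directly comparable and easy to visualize side by side.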
On the technical side, ClimateCheck combined a React + TypeScript + Vite frontend with a Python FastAPI backend, using async HTTP calls to parallelize data retrieval and Recharts for visualization.
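The parallelized-retrieval pattern described above might look something like this minimal asyncio sketch. The fetcher names and payloads are placeholders standing in for real HTTP calls to the climate data sources, not the team's actual code:

```python
import asyncio

# Placeholder fetchers standing in for real HTTP calls (e.g. via httpx
# or aiohttp) to FEMA, USFS and NOAA endpoints.
async def fetch_fema_flood(lat: float, lon: float) -> dict:
    await asyncio.sleep(0)  # stands in for network I/O
    return {"source": "fema", "flood_zone": "AE"}

async def fetch_usfs_wildfire(lat: float, lon: float) -> dict:
    await asyncio.sleep(0)
    return {"source": "usfs", "hazard_potential": 4}

async def fetch_noaa_alerts(lat: float, lon: float) -> dict:
    await asyncio.sleep(0)
    return {"source": "noaa", "active_alerts": []}

async def gather_climate_data(lat: float, lon: float) -> list:
    """Fire all upstream requests concurrently instead of one at a time."""
    return await asyncio.gather(
        fetch_fema_flood(lat, lon),
        fetch_usfs_wildfire(lat, lon),
        fetch_noaa_alerts(lat, lon),
    )

if __name__ == "__main__":
    results = asyncio.run(gather_climate_data(33.64, -117.84))
    print([r["source"] for r in results])  # ['fema', 'usfs', 'noaa']
```

Because `asyncio.gather` awaits all coroutines concurrently, total latency is roughly that of the slowest upstream call rather than the sum of all of them, which matters when a single report depends on five or more external services.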
Why It Stood Out
ClimateCheck is a great example of what we love to see at datathons:
- A real-world problem (climate risk for homeowners)
- A thoughtful data pipeline grounded in verified location data
- A smart combination of APIs, AI, and visualization to deliver insights users can actually understand and act on
We’re proud that Melissa’s data provided the geocoding and address foundation that made this multi-source climate analysis possible.
Explore All the IrvineHacks 2026 Projects
ClimateCheck was just one of many innovative submissions at this year’s event. To see the full range of projects—spanning AI, sustainability, UX, real estate and more—visit the official IrvineHacks 2026 DevPost page.
You’ll find an impressive collection of ideas that truly live up to the IrvineHacks motto: Create. Connect. Inspire.
Keep Building with Melissa for Education
Even though the hackathon weekend has ended, your work with real-world data doesn’t have to.
The Melissa for Education program gives students and faculty ongoing, free access to:
- Curated datasets (addresses, property, geo and more)
- Data quality and enrichment APIs used in B2B environments
- Tools to help you bring enterprise-grade data practices into class projects, capstones and research
If you enjoyed experimenting with APIs at IrvineHacks—or if you didn’t get the chance to during the event and want to explore now—we’d love for you to continue building with us. Learn more and sign up here! Have specific data needs? Contact the Melissa Education team at education@melissa.com to discuss custom dataset options.
We’re grateful to the Hack at UCI organizers for another inspiring year of IrvineHacks, and we’re already looking forward to sponsoring and supporting future datathons. Until then, keep experimenting, keep learning, and keep using data to build the technology of tomorrow.
