Tree Trimming Eligibility Tool

Redesigning a confusing Hydro One form that wasted resources and frustrated customers.

Role

Product Designer & Developer (Primary)

Impact

40%
reduction in rage clicks
66%
fewer immediate exits
42%
reduction in dead clicks

Timeline

3 months (design, development, testing, launch)

Tools

Figma, HTML/CSS/JavaScript, Microsoft Clarity (behavioral analytics), internal user testing

View live page

The Problem

Hydro One's tree trimming request form was creating operational chaos. Users couldn't figure out if they qualified for service.

Business Impact
  • High volume of ineligible requests wasting forestry resources
  • Unnecessary site visits and administrative overhead
  • Customer service team fielding clarification calls
User Pain Points
  • Confusing diagrams that didn't clearly explain responsibility
  • Unclear language about eligibility criteria
  • Process felt like a guessing game
The Challenge

How might we help customers accurately self-assess eligibility before submitting requests, reducing false positives while maintaining accessibility for those who genuinely need service?

The Design Process

Research & Competitive Analysis

I analyzed how other utilities handled tree trimming eligibility, looking for patterns in how they helped users self-assess.

Key finding: Duke Energy used a numbered diagram with explanations below—the most intuitive approach we found. Users could match their situation to a specific numbered scenario.

Stakeholder presentation: Shared competitive analysis findings with the forestry department. They agreed this numbered diagram approach was clearest for customers.

Iteration 1: Initial Redesign

Approach: Simplify to one core question, move the screening process higher on the page (it was buried halfway down), and use numbered diagrams with color-coded eligibility indicators. Add a timeline showing what happens after a request is submitted.

Iteration 1: Lo-fi Design

Iteration 1: Eligibility Form Flow

I simplified the form and added color-coded visual feedback to make eligibility clearer.

Design changes made:
  • Simplified from multiple questions to one focused eligibility check
  • Numbered power line zones on a diagram (inspired by Duke Energy)
  • Green checkmarks for eligible scenarios, red X marks for ineligible
  • Moved eligibility tool to top of page for better visibility

Hypothesis: Color coding would help users quickly identify their eligibility.

How It Works


User sees diagram with red or green markers

Opens accordion to learn about markers

User reads details about their scenario

User confirms "This is my situation"

User clicks the green or red button based on eligibility
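The Iteration 1 flow above can be sketched as a simple data model. This is a minimal, hypothetical reconstruction; the zone labels and eligibility rules are illustrative placeholders, not Hydro One's actual criteria.

```javascript
// Hypothetical Iteration 1 data model: each numbered zone on the diagram
// maps to an eligibility verdict and a color-coded indicator.
// Zone labels and rules are illustrative only.
const zones = {
  1: { label: "Primary power line", eligible: true },
  2: { label: "Secondary line to the house", eligible: false },
  3: { label: "Service line past the meter", eligible: false },
};

// Returns the indicator shown for a zone: green checkmark for eligible,
// red X for ineligible (the color coding that later testing showed biased users).
function indicatorFor(zoneNumber) {
  const zone = zones[zoneNumber];
  if (!zone) return null;
  return zone.eligible
    ? { symbol: "✓", color: "green", message: "Hydro One will handle this request." }
    : { symbol: "✗", color: "red", message: "This is the property owner's responsibility." };
}
```

Encoding the verdict into the button color is exactly what the next round of testing called into question.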

Testing Iteration 1 Revealed a Problem

Method: Moderated usability testing with 5 Hydro One employees (not from forestry department) over two weeks—week 1 tested the initial design, week 2 focused on refinements.

Test Scenario Examples:
  1. “You notice a tree overgrown near a power line, but it’s not sparking or burning. Use this page to find out if Hydro One will handle this request.”
  2. “Look at the diagram and decide if Hydro One is responsible for trimming a tree near the line labeled #3.”
  3. “Assume the tree affects a primary power line (#1). Complete the steps to submit a trimming request.”

What we discovered:

"I don't want to click the red one—it looks like an error"
— User avoiding the correct answer because of color coding

The real problem: Color was creating bias. Users avoided "red" options even when those options matched their actual situation. They were trying to pick the answer that would get service, not the answer that was true.

Additional issues:
  • Answer buttons were easy to miss because they didn't look like standard buttons
  • Users tried to click the numbered circles on the diagram (they weren't interactive)
  • Understanding each scenario required expanding accordions
  • Decision-making was backwards: "Which one gets them to come?" instead of "Which one matches my situation?"

Iteration 2: Interactive, Neutral Design

1. Remove judgment, enable exploration
Instead of showing eligibility upfront with color coding, present every scenario neutrally.

2. Flip the interaction model
Instead of asking users to answer questions, let them click through the diagram to find the scenario that matches their situation.


Key Design Changes:

  • Made diagrams interactive
    Clickable zones reveal details on demand

  • Removed color bias
    Neutral presentation until user commits to their scenario

  • Honest self-assessment
    Users find their actual situation, not the "right answer"


How It Works


User sees neutral diagram with numbered zones

Clicks zone that matches their situation

Modal opens with clear explanation

User confirms "This is my situation"

Next steps provided based on outcome
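The Iteration 2 flow above can be sketched in a few lines. This is an illustrative model under assumed names (the scenario data and next-step text are placeholders): the key design point is that a zone click reveals only a neutral description, and the verdict appears only after the user confirms.

```javascript
// Illustrative Iteration 2 model: zones are presented neutrally, and
// eligibility plus next steps are revealed only after confirmation.
const scenarios = {
  1: {
    description: "Tree is near the main line running along the road.",
    eligible: true,
    nextSteps: "Submit a trimming request for the forestry team.",
  },
  2: {
    description: "Tree is near the line running from the pole to the house.",
    eligible: false,
    nextSteps: "Hire a certified arborist to trim the tree safely.",
  },
};

// Step 1: clicking a zone opens a modal with only the neutral description —
// no color, no verdict, so nothing biases the user's choice.
function openScenario(zoneNumber) {
  const s = scenarios[zoneNumber];
  return s ? { description: s.description } : null;
}

// Step 2: after "This is my situation", show the outcome and next steps.
function confirmScenario(zoneNumber) {
  const s = scenarios[zoneNumber];
  return s ? { eligible: s.eligible, nextSteps: s.nextSteps } : null;
}
```

Splitting exploration from judgment is what let users pick the scenario that was true rather than the one that would get service.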

Iteration 2: User Testing

Tested the interactive prototype with a second group of 3 participants.

Results:

  • 100% task completion rate
  • Users successfully identified their scenario
  • No confusion about eligibility
  • Positive feedback: "Oh, I can click these! That's helpful."

Conclusion: Users explored scenarios honestly instead of trying to game the system.

Design + Development

I designed and built this solution end-to-end.

This allowed rapid iteration. I was able to test interactions in the browser same-day, make changes based on feedback immediately, and ensure implementation matched design intent.

Built with: JavaScript, HTML5, CSS3.

Accessibility: WCAG 2.1 AA (keyboard nav, ARIA labels, screen reader support)
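One piece of the keyboard-navigation work can be sketched as a pure helper. This is a hypothetical sketch, not the production code: it computes where Tab should move focus inside the modal so focus stays trapped in the dialog, a common requirement for WCAG-conformant modals.

```javascript
// Illustrative focus-trap helper for the scenario modal: given the number
// of focusable elements in the dialog, the currently focused index, and
// whether Shift is held, return the index Tab should move focus to.
// Focus wraps at both ends so it never escapes the open dialog.
function nextFocusIndex(count, currentIndex, shiftKey) {
  if (count === 0) return -1; // nothing focusable in the dialog
  return shiftKey
    ? (currentIndex - 1 + count) % count // Shift+Tab wraps first → last
    : (currentIndex + 1) % count;        // Tab wraps last → first
}
```

In the real page this would be wired to a `keydown` listener on the dialog, alongside Escape-to-close and ARIA `role="dialog"` labeling.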

The Impact

Microsoft Clarity data comparing November (before) vs. December (after redesign):

37%
faster completion. Average time on page: 3.0 min → 1.9 min
40%
reduction in rage clicks. Rage click rate: 0.65% → 0.39% of sessions
66%
fewer immediate exits. Quick back rate: 36% → 12% of sessions
42%
reduction in dead clicks. Dead click rate: 29% → 17% of sessions

Ongoing measurement: Continuing to track long-term operational impact on invalid request volumes and resource efficiency.