
INDUSTRY INSIGHT

4 ways agencies can ensure optimal website performance

Many remember the painful start to Affordable Care Act open enrollment in 2013, when the HealthCare.gov website went down for extended periods as surges of users tried to sign up for insurance. Many expected similar turmoil in 2017, given the growing number of cyberattacks across government and the administration's decision to end open enrollment advertising.

But this was not the case. By the time ACA enrollment closed in December and state exchange deadlines passed at the end of January, more than 8.8 million people had selected plans using HealthCare.gov, including 2.3 million new consumers and 6.4 million returning consumers renewing their coverage.

The improvement to the enrollment process was no accident. The 2013 open enrollment demonstrated that agency websites must be stable enough to withstand massive influxes of visitors while guaranteeing a smooth, secure online experience. The Centers for Medicare and Medicaid Services took a number of steps that other agencies can emulate as they prepare their own digital properties for anticipated -- and unforeseen -- surges in users.

1. Stay secure by implementing “defense in depth” architecture. Just as warfighters prefer to keep conflict away from critical assets, agencies should implement cybersecurity “at the edge” and combat bad actors as far from their own internal systems as possible. With technology that can analyze every IP address coming into a website, agencies can evaluate traffic at the network's edge. Proactive edge strategies minimize the number of attacks that actually reach the data center, ensuring malicious users cannot wreak havoc on a website's uptime and performance.
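As an illustrative sketch of that edge evaluation -- the blocklist, addresses, and function name here are hypothetical assumptions, not any agency's or vendor's actual implementation -- a filter might check each client IP against known-bad ranges before a request is allowed toward the origin:

```python
import ipaddress

# Hypothetical reputation data an edge node might consult; a real edge
# platform maintains and updates lists like this at global scale.
BLOCKED_NETWORKS = [
    # Documentation range standing in for a known-bad block.
    ipaddress.ip_network("203.0.113.0/24"),
]

def allow_at_edge(client_ip: str) -> bool:
    """Return True if the request may proceed toward the data center."""
    addr = ipaddress.ip_address(client_ip)
    # Drop traffic from known-bad networks at the edge, before it can
    # reach -- or load -- the agency's own infrastructure.
    return not any(addr in net for net in BLOCKED_NETWORKS)
```

In this sketch, a request from 203.0.113.7 would be rejected at the edge, while one from 198.51.100.1 would pass through to the deeper layers of defense.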

2. Build in real-time scalability and flexibility. It's difficult for any agency to project how many users a website will receive at a given time, which is why a web architecture must be flexible and scalable enough to meet any type of traffic surge. An effective web management solution must adapt at a moment's notice to traffic spikes while still providing appropriate levels of citizen services. Some events are tied to the calendar -- open enrollment, in this case -- allowing agencies to anticipate increased traffic and plan accordingly. In other cases, such as a weather agency tracking a rapidly developing storm, IT managers cannot predict when traffic will surge. Agency websites therefore need a scalable solution in place that gives them the flexibility to deliver optimal performance.
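One simple form of that real-time scalability is threshold-based autoscaling. The sketch below is a minimal illustration under assumed numbers -- the per-instance capacity, headroom factor, and bounds are hypothetical, not drawn from the article -- that sizes a server pool to observed demand:

```python
import math

def desired_instances(requests_per_sec: float,
                      capacity_per_instance: float = 500.0,  # assumed req/s one server handles
                      headroom: float = 1.2,                 # 20% buffer for sudden spikes
                      min_instances: int = 2,                # floor: always some redundancy
                      max_instances: int = 50) -> int:       # ceiling: budget/quota cap
    """Size a server pool to current traffic, clamped to configured bounds."""
    needed = math.ceil(requests_per_sec * headroom / capacity_per_instance)
    return max(min_instances, min(max_instances, needed))
```

With these assumed figures, 10,000 requests per second would call for 24 instances, while quiet periods fall back to the two-instance floor -- capacity tracks demand in both directions.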

3. Enact efficient bot management. A key component of a robust web strategy is successful bot management, which means distinguishing the bots that are part of essential business tasks from those that are malicious. All bots “scrape” website data, but bad bots do so in ways that weaken a system's defenses against cyberattacks -- an especially critical concern for agencies housing sensitive constituent information. Even good bots can hurt a website's performance if they arrive en masse: too many bots trying to access a website at once can slow performance, resulting in a poor site experience for human visitors.
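A minimal sketch of that triage follows, assuming requests have already been flagged as automated and using hypothetical signature lists -- real bot management relies on far richer signals than the User-Agent header:

```python
KNOWN_GOOD_BOTS = {"googlebot", "bingbot"}   # crawlers most sites welcome
KNOWN_BAD_SIGNATURES = {"sqlmap", "nikto"}   # common attack/scanning tools

def triage_bot(user_agent: str) -> str:
    """Decide how to treat a request already identified as automated."""
    ua = user_agent.lower()
    if any(sig in ua for sig in KNOWN_BAD_SIGNATURES):
        return "block"        # malicious automation never reaches the site
    if any(bot in ua for bot in KNOWN_GOOD_BOTS):
        return "rate-limit"   # even good bots are throttled during surges
    return "challenge"        # unknown automation gets extra scrutiny
```

The three-way outcome reflects the article's point: bad bots are stopped outright, while good bots are merely slowed so they cannot crowd out human visitors.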

4. Make sure that code is up to date. Web standards change quickly as technology improves, requiring regular updates to a website's infrastructure. Ignoring those updates can degrade an agency's site performance: incorrect or outdated code can slow down operations and increase a site's vulnerability to cyberattacks, so agencies should keep current on updates. Code updates also help improve website performance for the growing number of users who access websites from their smartphones.

An agency can have personnel and processes in place to ensure that citizen-facing operations run smoothly, but if its website cannot achieve the performance needed for online service delivery, citizens will be frustrated. Putting cutting-edge technology and industry best practices in place to guarantee website security and speed will keep agency services running smoothly and constituents satisfied.

About the Author

Tom Ruff is VP of Public Sector, Americas, at Akamai Technologies.
