
Mozilla Corporation Headquarters

Mountain View, CA 2013 – 2014

Project Scope

MKThink has designed a 54,000 square foot office that serves as Mozilla’s new headquarters in Mountain View, California.

“Coded” with the same passion, open-source ethos, and collaboration methodologies that Mozilla uses to create its software products, the new space embodies the company’s culture of openness and transparency. Recognizing the important contributions of Mozilla’s extensive volunteer community while enhancing the ability of full-time staff to collaborate effectively became the driving force in planning the new space.

Reworking an existing concrete shell originally built in 1982, the MKThink team engaged more than 200 Mozillians before beginning the design process to decide how best to create a space that everyone would be excited about and could work efficiently within. Online surveys, in-person group and individual interviews, all-hands presentations to the entire staff, and follow-up email exchanges yielded useful insights that were woven directly into the design.

The two-story glass wall at the entrance reflects Mozilla’s cultural philosophy of openness and transparency. The common space is central to the Mozilla community: it provides a space for all-hands meetings and for volunteer engineers to work and interact with full-time Mozillians, and the video wall at left allows offices around the globe (including Vancouver and London) to log in and connect with Mozilla HQ. The open stair links the two floor plates for increased face-to-face interaction. The entry space offers a glimpse into the office beyond, with easy access to bike storage and different levels of security access for employees and volunteers.

 

Watch this video for a more in-depth analysis of the space, where MKThink Principal Steve Kelley talks about our user engagement and design processes.

Every Corner, Every City

Crowdsourced air quality sensors provide an opportunity to fill gaps in the resolution and availability of Environmental Protection Agency (EPA) data
by Sean Dasey

 

Fire at the Chevron Refinery in Richmond, CA.

Fire and smoke erupted from an accident at the Chevron Refinery in Richmond, CA, in August 2012.
Source: http://blogs.sfweekly.com/thesnitch/2012/08/fiery_explosion_rocks_east_bay.php

In 2014, the American Lung Association ranked Fresno, CA as the most polluted of 277 metropolitan areas in America for 24-hour particle pollution. [1] Salinas, CA was ranked the cleanest. Fresno and Salinas are separated by only 100 miles. This underscores the fact that air quality can vary greatly across nearby regions.

Although the current EPA air quality monitoring network captures differences across regions, it is not equipped to capture differences across localities and neighborhoods. For example, the 9-county San Francisco Bay Area, home to 7.5 million residents, has only 39 stations. In fact, this network failed to provide sufficiently detailed neighborhood scale air quality data during the August 6, 2012 Chevron refinery fire in Richmond, CA (pictured above), after which 11,000 nearby residents made emergency room visits for respiratory issues. [2] The closest EPA particulates monitor was located 2 miles away from the refinery, and the results took two weeks to analyze.

Historically, the EPA has focused on capturing air quality data on a regional scale by placing several monitoring stations per county, away from major roads or industrial emission sources in order to measure region-wide averages. The current EPA network of stations, however, is not designed to reveal how pollution sources affect the air quality of nearby neighborhoods with any real specificity.

Regional Weather Stations

Map of Bay Area District Air Monitor Sites; the dots indicate sites where at least one full year of high-quality wind speed, wind direction, and temperature data suitable for modeling purposes has been archived. Source: http://hank.baaqmd.gov/tec/maps/dam_sites.htm#

One possible solution to this problem would be to crowdsource cheaper sensors to fill in the gaps between sporadically placed EPA stations.

In 2012, an open-source carbon monoxide and nitrogen dioxide sensor called the Air Quality Egg was successfully crowdfunded on Kickstarter and was named “Best of Kickstarter: 2012”. Although the Egg is not calibrated, and therefore does not give accurate readings on an individual basis the way EPA sensors do, it is several orders of magnitude cheaper than a calibrated sensor. This gives citizen scientists the opportunity to deploy Eggs at a much larger scale than the existing EPA station network; a rough sketch of how such raw readings might be put to use follows the diagram below.

Egg Systems Diagram

The Air Quality Egg began as a Kickstarter-funded sensor that is far less expensive and easier to use than traditional air quality monitors.
Source: http://airqualityegg.com/
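
As a rough illustration (not an official Egg or EPA procedure, and using entirely synthetic numbers), one way citizen scientists might make uncalibrated readings more useful is to co-locate a single Egg with a calibrated reference monitor, fit a simple correction, and then apply that correction across the wider crowdsourced network:

```python
# Minimal sketch: correct the bias of an uncalibrated, low-cost sensor against a
# co-located reference monitor, then apply that correction to the rest of the network.
# All data below are synthetic; nothing here reflects actual Egg firmware or EPA methods.
import numpy as np

rng = np.random.default_rng(0)

# Hourly NO2 (ppb) at a site where an Egg sits next to a calibrated reference monitor.
reference = 20 + 10 * rng.random(24 * 30)                                # reference monitor
egg_colocated = 0.6 * reference + 8 + rng.normal(0, 2, reference.size)  # biased, noisy Egg

# Fit a simple linear correction: reference ≈ a * egg_reading + b
a, b = np.polyfit(egg_colocated, reference, deg=1)

# Apply the same correction to raw readings from other Eggs in the network.
egg_network = np.array([18.5, 22.1, 30.4, 27.9])
corrected = a * egg_network + b
print(f"correction: corrected = {a:.2f} * raw + {b:.2f}")
print("corrected readings (ppb):", np.round(corrected, 1))
```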

Louisville, KY, a historically industrial city that was once described as “smoky and blackened” by a visiting Charles Dickens, is emerging today as a leader in utilizing Eggs to monitor air quality on a neighborhood scale. On April 24, 2014, the nonprofit Institute for Healthy Air, Water and Soil, led by philanthropist Christy Brown and endorsed by Louisville Mayor Greg Fischer, announced the deployment of 100 Eggs around Louisville. [3] They will be deployed strategically, in neighborhoods downwind of heavily industrial zones.

The Egg is one example of a crowdsourced sensor contributing to an emerging network of sensors providing environmental and health data at the neighborhood scale. In Louisville, the Institute for Healthy Air, Water and Soil has partnered with the Louisville Asthma Data Innovation Project to correlate Egg data with time and location data from Bluetooth-enabled asthma inhalers. As opposed to the traditional method of accumulating respiratory health data from county hospitals to get regional-scale statistics, this new method will yield more precise, location-based data that can pinpoint specific neighborhoods where air quality and respiratory health are concerns.
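
The Louisville project’s actual pipeline has not been published, so the following is only a sketch of the kind of join described above: each time-stamped inhaler event is matched to the nearest Egg and to that Egg’s most recent reading. The data, column names, and coordinates are all invented for illustration.

```python
# Hypothetical sketch: pair Bluetooth inhaler events with readings from the nearest Egg.
import pandas as pd
import numpy as np

eggs = pd.DataFrame({
    "egg_id": ["egg_01", "egg_02"],
    "lat": [38.25, 38.22],
    "lon": [-85.76, -85.74],
})

readings = pd.DataFrame({
    "egg_id": ["egg_01"] * 3 + ["egg_02"] * 3,
    "time": pd.to_datetime(["2014-05-01 08:00", "2014-05-01 09:00", "2014-05-01 10:00"] * 2),
    "no2_ppb": [22.0, 35.0, 41.0, 18.0, 20.0, 19.0],
})

events = pd.DataFrame({   # time and location of inhaler use
    "time": pd.to_datetime(["2014-05-01 09:10", "2014-05-01 09:50", "2014-05-01 10:05"]),
    "lat": [38.251, 38.249, 38.221],
    "lon": [-85.761, -85.758, -85.742],
})

def nearest_egg(lat, lon):
    # Crude planar distance is adequate at neighborhood scale.
    d = np.hypot(eggs["lat"] - lat, eggs["lon"] - lon)
    return eggs.loc[d.idxmin(), "egg_id"]

events["egg_id"] = [nearest_egg(r.lat, r.lon) for r in events.itertuples()]

# For each event, take the latest reading from its nearest Egg at or before the event time.
paired = pd.merge_asof(
    events.sort_values("time"),
    readings.sort_values("time"),
    on="time", by="egg_id", direction="backward",
)
print(paired[["time", "egg_id", "no2_ppb"]])
```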

Asthma Incidence in Louisville, KY

Hotspots of asthma incidence in Louisville, KY.
Source: http://www.slideshare.net/HealthDataConsortium/louisville-asthmapolis

Using RoundhouseOne’s proprietary data management system, 4Daptive, we can analyze correlations between data from the Egg we’ve deployed in Louisville and our larger sensor network, measuring thermal comfort, foot traffic, and outdoor weather conditions. 4Daptive provides the ability to manage data through an organized database, analyze correlations between multiple data sources, and produce user-created charts accompanied by customizable statistical outputs.
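
4Daptive itself is proprietary, so the snippet below is only a generic Python sketch of the underlying idea: align several sensor streams on a common hourly index and inspect their pairwise correlations. Every series here is synthetic, and the column names are hypothetical.

```python
# Generic illustration of correlating multiple aligned sensor streams (not 4Daptive code).
import pandas as pd
import numpy as np

idx = pd.date_range("2014-06-01", periods=24 * 7, freq="h")   # one week of hourly samples
rng = np.random.default_rng(1)
daily_cycle = np.sin(np.arange(idx.size) / 24 * 2 * np.pi)

streams = pd.DataFrame({
    "egg_no2_ppb":   25 + 8 * daily_cycle + rng.normal(0, 2, idx.size),
    "foot_traffic":  rng.poisson(30, idx.size),
    "indoor_temp_f": 72 + rng.normal(0, 1, idx.size),
    "outdoor_temp_f": 65 + 10 * daily_cycle + rng.normal(0, 2, idx.size),
}, index=idx)

# Pairwise Pearson correlations between the streams.
print(streams.corr().round(2))
```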

Limitations of available data have traditionally forced air quality analyses to be done on a regional scale. The emergence of localized, neighborhood-scale data will provide opportunities for new insight on how air quality affects our daily lives.

Sean Dasey is a Data Analyst at RoundhouseOne, MKThink’s in-house data analytics team. His work focuses on studying the effect of building design on thermal comfort, indoor air quality, and energy use.

FOOTNOTES

[1] American Lung Association. “State of the Air 2014.” April 30, 2014. http://www.stateoftheair.org/2014/city-rankings/
[2] Bulwa, Demian and Kane, Will. “Refinery Smoke Blew Past Air Monitors.” San Francisco Chronicle, August 29, 2012. http://www.sfgate.com/bayarea/article/Refinery-smoke-blew-past-air-monitors-3800068.php
[3] Bruggers, James. “Looking for Air Pollution Hot Spots with Micro-Monitors.” The Courier-Journal, April 27, 2014. http://www.courier-journal.com/story/tech/science/environment/2014/04/26/air-quality-eggs-louisville/8174967/

USEFUL LINKS

Air Quality Egg location visualization and information: http://airqualityegg.com/
Institute for Healthy Air, Water, and Soil, with interesting data on air quality in Kentucky: http://www.instituteforhealthyairwaterandsoil.org/
State of the Air, by the American Lung Association: http://www.stateoftheair.org/

 

A Closer Look at Mozilla’s New MKThink-designed Workspace in Mountain View

Mozilla’s New Workspace in Mountain View from Nexus 1 on Vimeo.

Mozilla's new workspace in Mountain View, California was coded with the same care, passion, open-source ethos, and collaboration methodologies with which Mozilla creates its software products. This clip provides a glimpse into the strategic and creative process that maps Mozilla's unique culture into a collaborative workspace.


Drought Awareness: Data is Emerging, Design Should Follow

by Christopher Damien

It’s hardly news that California is in the throes of a serious drought. California’s final Department of Water Resources snow survey of 2014, published on May 1, reported that the statewide snowpack’s water content is at 18 percent of average for the date. Such arid circumstances were anticipated after the April 1 snow survey found water content at only 32 percent of average. This is troubling news considering that California receives about a third of its water supply from these snowpacks.(1)

Water agencies have experimented with and implemented several methods for budgeting water. “Allocation pricing,” for example, budgets water in terms of how much users ought to be using: based on geography and demographics, a user is allocated a certain amount of water, and with overconsumption, rates increase dramatically. With these methods, water agencies are attempting to use pricing to wake consumers to the severity of our current situation. However, consumers in the Bay Area have not yet cut water use by the 10-20% requested by the San Francisco Public Utilities Commission. Rationing this resource will certainly prove to be a challenge, given Californians’ varied degrees of thirst.
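
As a concrete, entirely hypothetical sketch of how allocation pricing penalizes overconsumption, consider a simple two-rate structure. The tier prices, the allocation, and the use of CCF (hundred cubic feet) billing units below are assumptions for illustration, not any agency’s actual rates.

```python
# Hypothetical allocation-based ("budget-based") water pricing sketch.
def water_bill(usage_ccf: float, allocation_ccf: float) -> float:
    """Return a monthly bill in dollars for usage measured in CCF (hundred cubic feet)."""
    base_rate = 3.00     # $/CCF within the allocated budget (assumed)
    penalty_rate = 9.00  # $/CCF for every unit over the allocation (assumed)
    within = min(usage_ccf, allocation_ccf)
    over = max(usage_ccf - allocation_ccf, 0.0)
    return within * base_rate + over * penalty_rate

# A household allocated 10 CCF/month pays sharply more per unit once it exceeds its budget.
print(water_bill(usage_ccf=8, allocation_ccf=10))    # 24.0  (all within budget)
print(water_bill(usage_ccf=14, allocation_ccf=10))   # 66.0  (30.00 + 4 * 9.00)
```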

Human behavior will be the most difficult barrier to water security. As accurate monitoring increases our awareness of the impacts of overconsumption, the design of our built systems must follow suit. This necessary shift will only result from a clear evaluation of the various water realities throughout California.

Can design not only make systems more efficient, but also make consumers more aware of how precious this resource is becoming?

Design that adequately addresses water scarcity must be rooted in data. The first step will be to raise awareness of where water is coming from, and then to reevaluate the practicality of the distances it travels.

Where is your water coming from?

Diagram of the Hetch Hetchy Water System, courtesy of SPUR

San Francisco receives 80% of its water from the Hetch Hetchy Project, requiring transport of over 150 miles. This transport of water over large distances is a peculiar characteristic of urban centers throughout the American West, one that is largely an artifact of yesteryear’s inclination for grand, if not hubristic, engineering.

What opportunities are there, other than major feats of engineering that pipe water from distant climes?

This is the design challenge posed by Peter and Hadley Arnold of the Arid Lands Institute, who recently unveiled their program for design that substantially accounts for both geographic aridity and actual local rainfall in Southern California’s San Fernando Valley Basin, entitled “The Case for Divining LA.” In it, they exhibit a model of stormwater runoff based on 30 years of precipitation data, visualizing the path of runoff and opportunities for harvest and use. This high-resolution geo-spatial model is part of a larger effort to visualize Southern California’s water reality: “520,000 acre-feet of unused stormwater is sent as discharge to the Pacific Ocean each year, enough to support 500,000 families at current usage rates with no conservation measures in place.”(2)
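
A quick arithmetic check on the quoted figures (using the standard conversion of one acre-foot to roughly 325,851 gallons) shows the implied per-family allotment:

$$
\frac{520{,}000\ \text{acre-ft/yr}}{500{,}000\ \text{families}} \approx 1.04\ \text{acre-ft per family per year} \approx 1.04 \times 325{,}851 \approx 3.4\times10^{5}\ \text{gallons per family per year}.
$$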

Their model includes surface runoff as a result of precipitation, surface permeability, and soil types and conditions. This model led the Arid Lands Institute to conclude that “urban stormwater and recycled municipal supplies combined with increased efficiency could meet up to 82 percent of Los Angeles’ water demand,” 82% that would not need to be piped via the 400-mile Los Angeles Aqueduct.

Geo-spatial model of Los Angeles water sources, courtesy of the Arid Lands Institute

Efforts like this will be needed across geographies and municipalities throughout California, and throughout a warming world, as drought and aridity become more prevalent characteristics of life. These modeling efforts offer awareness of real resource surplus and scarcity, allowing design solutions to be grounded in reliable data.

In our own work, MKThink employs evidence-based design practices and seeks to enable user behavior through design rather than force it. We ask, how might design offer aesthetic awareness of drought? How might design offer awareness of distant geographies impacted by exorbitant consumption? How might we avail ourselves of the missed opportunities outside our doors?

Real knowledge of a legitimate drought and real knowledge of consumption patterns and sources will hopefully allow people to quench their thirst accordingly and stop watering their lawns; above all, it may finally force people to take responsibility for where they decide to put down roots.

FOOTNOTES:
(1) California Department of Water Resources (DWR), “Year’s Final Snow Survey Comes up Dry: 3-Year Drought Retains Grip as Summer Approaches”
(2) Arnold, Hadley and Peter, “Pivot: Reconceiving Water Scarcity as Design Opportunity: Mapping a More Absorbent Landscape,” BOOM Fall 2013, pgs. 95-101

USEFUL LINKS:
Navigating the various sources of California’s water: Water Education Foundation.
High-Resolution Geo-Spatial Model of SoCal’s Water Reality: by the Arid Lands Institute.

Look Who’s Talking – Energy Conservation Edition





by Mark R Miller, AIA

CEO, MKThink

Let’s start with a test of your GREEN I.Q.: which institution has this institutional priority?

“…More strategic use of energy resource…lowering risk…saving money…and allowing the department to shift more resources to other…priorities. Such efforts are critical if we are to meet our mission to prevail, today and in the future.”

A. State of California: Board of Regents

B. US Department of Energy

C. US Green Building Council

D. University of California – San Francisco (UCSF)


E. US Department of Defense

The answer is in the full quote:


DoD’s Operational Energy Strategy will guide the Defense Department to a more strategic use of energy resources in the fight today and in plans for the future by lowering risks to our warfighters, saving money for American taxpayers, and allowing the department to shift more resources to other warfighting priorities. Such efforts are critical if we are to meet our mission to prevail, today and in the future. – US DEPARTMENT OF DEFENSE

For more go right to the Department of Defense website: http://www.defense.gov/home/features/2011/0611_energy/

Yes, energy management, and more specifically a significant reduction in fossil fuel use, is a non-political, mission-critical objective promoted by the Secretary of the United States Navy, Ray Mabus (refer to an NPR interview with Secretary Mabus here: http://www.npr.org/2010/12/03/131785448/Military-Goes-Green-For-An-Edge-On-The-Battlefield).

Why? Well, it is more than a public relations initiative. According to the Department of Defense, it is a rather straightforward assessment: reduced reliance on fossil fuels will increase mission effectiveness, save lives, and save money – not a bad trifecta. This assessment is the basis behind high-level strategic planning that is reshaping the military’s approach to everything from advanced research to forward operating base operations. The US Department of Defense provides more detail on the strategic role of energy in this report:

http://energy.defense.gov/OES_report_to_congress.pdf

This recognition by the DoD is important in many ways. Some are obvious: support for clean technologies from an institution as large and influential as the DoD will be a big boost to emerging clean technologies and ongoing research. The military offers a large market for commercially viable (and domestic!) clean technologies. It also provides mission-critical venues to explore emerging technologies, accelerating their testing and potential for commercial viability.

There are deeper benefits. This decision comes from deep, data-driven analysis of the impact of fossil-fuel energy patterns on military operational effectiveness. This is not a political decision. Rather, the assessment findings had to indicate overwhelmingly the cost of the prior direction in order to overcome a red-leaning culture that has dismissed, and would have been expected to continue to dismiss, energy as a relevant issue.

Nothing like a bit of solid data-driven analysis presented by a respectable institution to fundamentally change the debate.

For more on the Department of Defense strategic assessment of energy reference the following article links:

http://www.defense.gov/home/features/2011/0611_energy/

http://www.npr.org/2010/12/03/131785448/Military-Goes-Green-For-An-Edge-On-The-Battlefield

http://www.greencarcongress.com/2011/06/dod-20110614.html

http://science.dodlive.mil/2011/06/14/energy-for-the-war-fighter-the-dods-operational-energy-strategy/

http://www.cnn.com/2011/09/22/opinion/cuttino-militarygreen/index.html?iref=allsearch

http://apps1.eere.energy.gov/news/news_detail.cfm/news_id=17763


http://www.sustainablebusiness.com/index.cfm/go/news.display/id/23039

http://www.greentechmedia.com/articles/read/will-the-military-be-the-bridge-to-the-u.s.-renewable-energy-future/

http://www.fiercegovernment.com/story/dod-position-incubate-clean-energy-says-pew-report/2011-09-27