The Tulum Raiders have been hard at work thinking of ways to improve the Tulum Neighborhood Portal. After receiving feedback from E.Y. Venture Capitalists, we have made some adjustments to further focus the aim of our website and to increase the user-friendliness of our product.
Clarifying Our Purpose
There are two main purposes of our website:
1) Map existing community resources and
2) Identify future needs.
- The asset map and public information API serve purpose 1
- The poll and comments serve purpose 2
- Poll results and comments will not be mapped
Change in API
Rather than utilize the Twitter API to simply list real-time Tweet feeds, we have decided to employ the ESRI Public Information Map API. This is different in that it funnels multiple social media feeds (Tweets, YouTube videos, Flickr posts) and maps them as points with clickable info windows containing the posted content, based on a geo ID. This will better serve Tulum community members; if folks tweet about an event at the park, the event’s general location will be mapped and linked to the tweet.
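As a rough illustration of the funneling step described above, here is a minimal sketch of how geotagged posts from several feeds could be normalized into map points with info-window text. The field names (source, lat, lon, text) and the sample posts are our own assumptions for illustration, not the actual ESRI Public Information Map schema.

```python
def to_map_points(posts):
    """Keep only geotagged posts and shape them as map points
    with simple info-window text."""
    points = []
    for post in posts:
        if post.get("lat") is None or post.get("lon") is None:
            continue  # un-geotagged posts cannot be mapped
        points.append({
            "lat": post["lat"],
            "lon": post["lon"],
            "info": f"[{post['source']}] {post['text']}",
        })
    return points

# Invented sample feed items: one geotagged tweet, one photo with no location
feeds = [
    {"source": "Twitter", "text": "Concert at the park at 6pm!",
     "lat": 34.05, "lon": -118.24},
    {"source": "Flickr", "text": "Sunset photo", "lat": None, "lon": None},
]
print(to_map_points(feeds))
```

Only the tweet survives the filter here, which mirrors the behavior we want: posts without a usable geo ID simply never reach the map.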
To accommodate the mapped points mentioned above, we have decided to make a separate toggled layer called Tulum Social. Our community resources will become a separate layer as well, so that both can be turned on and off to avoid crowding the map.
In an effort to make the design more user-friendly, we have decided to use a one-column layout. This will allow for a more natural placement of our toggles and polls, while leaving the bottom open for public comments. We are also in the process of exploring the use of a photo image as the backdrop for the web page.
- Airnowgateway.org does not consider weather conditions, not even temperature, so the Google weather layer is an ideal complement to our new air quality data.
- Aim to incorporate wind data vectors
Particulate matter data
- Airnowgateway.org provides real-time or historical air quality data at the zip-code level for several parameters, including PM 2.5, CO2, CO, and N2O.
- Still working out how to incorporate this into the map, and how to combine it with the weather/wind data.
These are some of the primary variables we’re trying to consider on the map:
- Wind – should this be multiple arrows, or a single arrow for each designated area?
- Temperature – at what geographic level should this be displayed?
- AQI – how would we display this in a clear, concise way that people can understand? Should we incorporate a legend/key?
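One simple answer to the legend/key question is to bucket AQI values into the standard US EPA categories and colors, which most audiences already recognize. A minimal lookup sketch:

```python
# Standard US EPA AQI breakpoints: (upper bound, label, legend color)
AQI_CATEGORIES = [
    (50, "Good", "green"),
    (100, "Moderate", "yellow"),
    (150, "Unhealthy for Sensitive Groups", "orange"),
    (200, "Unhealthy", "red"),
    (300, "Very Unhealthy", "purple"),
    (500, "Hazardous", "maroon"),
]

def aqi_category(aqi):
    """Map a numeric AQI value to its EPA category label and color."""
    for upper, label, color in AQI_CATEGORIES:
        if aqi <= upper:
            return label, color
    return "Beyond AQI", "maroon"

print(aqi_category(42))   # -> ('Good', 'green')
print(aqi_category(135))  # -> ('Unhealthy for Sensitive Groups', 'orange')
```

A map legend could then show the six colored swatches with their labels, so a reader never has to interpret a raw AQI number.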
ArcGIS layers/shapefiles we’re considering adding to the map:
- Elementary/secondary schools
- Parks/County Parks
- Zip-codes for LA County
- Originally, we planned to toggle multiple layers on and off as a combined group. Now we’ve decided to have individual layers that can be clicked on and off (at least for the midterm).
- Have pre-designated variables that are applicable to each user
- Still allow users to add additional layers where they see fit/necessary
- Adaptable buffer – able to change the size of the buffer
- We haven’t figured out how this would interact with the weather and particulate matter data (e.g., would buffers combine temperature/particulate matter values?)
- Designating which features will be permanently displayed and which features can be altered by the users
- Trying to determine the spatial scale at which the weather and particulate matter data are presented
- Which layout would be the most efficient and straightforward in terms of allowing the designated users to get the most out of the data
- If and when we consider adding forecasts, how would we display this?
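One way the adaptable-buffer idea above could combine particulate-matter values is to average the readings of all monitoring points falling within the user-chosen radius. This is only a sketch under our own assumptions: the station locations and PM 2.5 values below are invented, and a real implementation would pull them from Airnowgateway.org.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def buffer_mean(center, stations, radius_km):
    """Mean PM 2.5 reading of all stations within radius_km of center."""
    vals = [s["pm25"] for s in stations
            if haversine_km(center[0], center[1],
                            s["lat"], s["lon"]) <= radius_km]
    return sum(vals) / len(vals) if vals else None

# Invented sample stations for illustration
stations = [
    {"lat": 34.05, "lon": -118.24, "pm25": 12.0},
    {"lat": 34.06, "lon": -118.25, "pm25": 18.0},
    {"lat": 34.50, "lon": -118.60, "pm25": 40.0},  # far away, excluded
]
print(buffer_mean((34.05, -118.24), stations, radius_km=5))  # -> 15.0
```

Resizing the buffer then just means re-running the same function with a different radius, so the "adaptable" part comes for free.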
Updated wire frame diagram:
Things in the T.E.S.L.A. collaborative have been exciting lately! Last Wednesday we had a meeting with the Institute of the Environment and our partners at UC Davis. On the conference call we discussed our data set, variables, and important features of the website. Today (Tuesday), we received the data set that we need for our midterm project. We have been brainstorming design features of our website and are meeting tonight to make them functional!
One concern that we have is in regard to the privacy of our results. We need to find a way to create a “draft” overlay watermark so that it is clear that our analysis is preliminary and not finalized (this is per the request of the Institute of the Environment). Also, we would like to password-protect our website to limit access to EY Ventures and our team members (again, at the request of the client).
Update (Tuesday at 9:30 pm):
At our meeting tonight we discussed each of our responsibilities as we proceed.
Website components we want to include for the midterm:
- Password Protection
- Draft watermark
- Dropdown for time
- Integrating GIS map data
- Aggregating zones
Website components we want to include for the final:
- Address locator
- Google Charts API
Team member responsibilities:
- Jacki- GIS data integration w/ website
- Kristen- LEED buildings KMZ file, help with GIS data, written portions of website
- Kyle- Design website framework, add in each component
- Zhongbo- Create T.E.S.L.A. logo, create the title banner that says “City of Los Angeles Energy Consumption” with scrolling pictures
Questions for EY Ventures:
- Our GIS files are very large (and this is an understatement)… where can we store them? Can we have a pw protected portion of the ftp site?
- GIS- when we use graduated colors for energy use, we get an error message. What does this mean? Also, there are gaps in our maps even though the data is present.
- TileMill – is this a good resource? What are the advantages and disadvantages? Can we integrate Google features into TileMill? Does Google/GIS allow us to aggregate the regions as we zoom in/out?
- How do we aggregate data as we zoom out? How do we do this so that it’s smooth and doesn’t take a long time to load?
- HOW do we integrate GIS data into google maproom? We have LOTS of data and it needs to be fast- is this possible? What are our options?
- Talk about the midterm and final requirements and possible variations from requirements (ie- switch the timeline for some components, API use, etc)
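On the zoom-aggregation question above, one common approach (our assumption, not a confirmed answer from EY Ventures) is grid binning: snap points to a grid whose cell size grows as the zoom level shrinks, then draw one symbol per occupied cell. The cell-size formula and sample points below are arbitrary placeholders.

```python
from collections import defaultdict

def aggregate(points, zoom):
    """Bin (lat, lon, value) points into grid cells sized by zoom,
    returning one averaged value per occupied cell."""
    cell = 10.0 / (2 ** zoom)  # coarser grid at lower zoom levels
    bins = defaultdict(list)
    for lat, lon, value in points:
        key = (round(lat / cell), round(lon / cell))
        bins[key].append(value)
    return {k: sum(v) / len(v) for k, v in bins.items()}

# Two nearby Los Angeles points plus one far-away point
pts = [(34.05, -118.24, 100), (34.06, -118.25, 300), (40.7, -74.0, 50)]
print(len(aggregate(pts, zoom=2)))   # coarse zoom: the two LA points merge
print(len(aggregate(pts, zoom=12)))  # fine zoom: each point keeps its own cell
```

Because the binning is a single pass over the data, it can be precomputed per zoom level, which also addresses the loading-speed concern.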
We also re-worked our website design.
With feedback from E.Y. Ventures, GeoStories has decided to change/add the following:
1) We will focus more of our efforts into website functionality, and add design features later as time permits. As a result, new wireframes were created.
Brief description and instructional video. We will add a ‘Sign In’ feature so users can have a GeoStories account to create/edit stories. To be added: navigation bar with ‘Sign In’, ‘Start Your Story’, ‘Search/Story Library’, and ‘Contact Us’.
Input Page: Here is where our clients will build/edit their stories after signing in. The ‘Sign In’ page will require users to sign in with their Gmail accounts so the site can access their Picasa accounts, where their personal pictures and videos are stored.
The Input Page will ask for information such as Place/Address, Time, Media Upload (Pictures/Video), and finally Chapter Text. Users will have the option to add as many ‘Chapters’ as the story requires (denoted by carets on either side). Concerns: converting Place/Address information to lat/long, geocoding pictures/video.
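On the Place/Address-to-lat/long concern: the Google Geocoding API returns JSON containing a `results[0].geometry.location` object, so the conversion mostly reduces to one HTTP request plus the parsing step sketched below. The sample response here is abbreviated and invented for illustration; a real call would fetch it from the API with an address and key.

```python
import json

def extract_latlng(geocode_json):
    """Pull (lat, lng) out of a Geocoding-API-style JSON response."""
    results = geocode_json.get("results", [])
    if not results:
        return None  # address could not be geocoded
    loc = results[0]["geometry"]["location"]
    return loc["lat"], loc["lng"]

# Invented sample response, shaped like the Geocoding API's JSON
sample = json.loads("""
{"results": [{"geometry": {"location": {"lat": 34.0407, "lng": -118.2468}}}],
 "status": "OK"}
""")
print(extract_latlng(sample))  # -> (34.0407, -118.2468)
```

Handling the empty-results case up front matters for the Input Page, since user-typed addresses will sometimes fail to geocode and the form needs a graceful error.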
Story Page View: Prototype story view after a client has input a minimum of 3 Chapters. To be added: once you click on a chapter, an icon on the map will appear with an info window pop-up containing media and chapter text, with the option for multiple pages/related media. As you move through the chapters, the map will move (possibly with line linkages between icons/nodes) to the next icon with a new info window.
2) Still researching: use of the Picasa and YouTube APIs, lat/long conversion, info windows with text/media, and mapping movement between icons/nodes. We will look into other potential API sources and how to further simplify our design.
3) Will start building forms (input page) and test story building functionality.
This week, FNR Consulting has conducted basic research toward launching the Beta site scheduled for next week. Based on advice from Yoh and Maddie and group discussion, FNR has decided to introduce a new API, the Yelp API, to its website. The new wireframe is as follows. The basic idea of FNR’s website remains the same: it allows users to map the pedestrian shed surrounding LACMTA bus stops and rail stations, in addition to a trip-planning function. It also provides specific business information within the pedestrian shed.
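As a sketch of the pedestrian-shed filter, the simplest approximation is a straight-line quarter-mile radius around a stop; a real pedestrian shed would follow the walking network instead. The stop coordinates and business list below are invented stand-ins for what a Yelp API search would return.

```python
import math

QUARTER_MILE_M = 402.3  # quarter mile in meters

def within_shed(stop, businesses, radius_m=QUARTER_MILE_M):
    """Names of businesses within a straight-line radius of a stop."""
    lat0, lon0 = stop
    m_per_deg_lat = 111_320.0  # meters per degree of latitude
    m_per_deg_lon = m_per_deg_lat * math.cos(math.radians(lat0))
    keep = []
    for b in businesses:
        dy = (b["lat"] - lat0) * m_per_deg_lat
        dx = (b["lon"] - lon0) * m_per_deg_lon
        if math.hypot(dx, dy) <= radius_m:
            keep.append(b["name"])
    return keep

stop = (34.0622, -118.3082)  # a hypothetical bus stop
biz = [
    {"name": "Cafe A", "lat": 34.0630, "lon": -118.3080},
    {"name": "Market B", "lat": 34.0750, "lon": -118.3200},
]
print(within_shed(stop, biz))  # -> ['Cafe A']
```

Exposing `radius_m` as a parameter keeps the shed size adjustable, which fits the site’s user-driven mapping goal.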
Weeks 4 and 5 will be dedicated to collecting interesting photos of MacArthur Park and its immediately surrounding areas. The photos will be tagged with many keywords such as decade, subject matter, location (coordinates if possible), etc. I expect to have several dozen of them tagged and ready by the Week 6 presentation.
This week Fort Awesome Inc. divided the project elements, with each member focusing on a specific task for the upcoming beta launch. In addition, based on the board’s comments, we decided to implement two APIs initially: the Flickr API and the Instagram API (if time allows, we also hope to get a social media API, probably Foursquare, into the beta). We will also work toward a method to allow uploads from audience members who want to add user-generated data. The beta will also have placeholders for the other APIs that the site will eventually call on.
Group Member Updates
- Justin has been going through the LAPL photo database, selecting images from various decades in and around MacArthur Park. He has been uploading the images to Flickr and tagging them by topic, decade and location. We hope to have a sizable collection to display for the beta launch.
- Roy has been reading the documentation on the APIs we are using, determining the various calls and functions each API uses and selecting which of those features we will need for the site.
- Daniel has started to code the skeleton of the site itself.
- Alex has researched available demographic and socioeconomic data. He soon realized that the Census only has Excel-separated data for 1990 and 2000. Although he located hundreds of pages (and GBs) of scanned censuses going back to America’s early days, most of these focused on the county or city level and were not Excel-ready. Fortunately, our board member Yoh Kowano shared information on the demographic-rich Hypercities website. Once we receive the API information, we can hopefully call upon much of their data for our pre-1990 socioeconomic layers. In the meantime, Alex worked with the 1990 and 2000 data, preparing the Excel files for GIS use. He organized total population, race, median household income, and employment status by census tract. In addition, he aggregated various employment-status columns to calculate the percent of the work force employed and unemployed. He then joined the datasets with census tract shapefiles in GIS and has begun to prepare the layers for KML export.
This week Fort Awesome Inc. will combine like Voltron, sticking together our individual weekly tasks into a cohesive whole. As this process will probably run into many roadblocks, we anticipate some site functionality and deliverable revisions. However, we are confident we have the groundwork to put it all together and display a lovely working blueprint of Secret City!
Secret City will attempt to use the Flickr API and Instagram API to facilitate the display of pictures from particular generations (Flickr) and will work toward a method to allow uploads from audience members who want to add user-generated data. Fort Awesome will narrow its focus to these Secret City features for the purposes of the midterm, and we aim to have these two APIs implemented by then.
This week, Railway to Heaven has entered into diversified production mode in preparation for its beta website launch next week. Following research and group discussion the team has decided to alter its website proposal slightly. Instead of examining a single route between Los Angeles and the Bay Area, the website will showcase built environment, transportation, and housing data for the HSR route connecting Los Angeles and Anaheim.
The RtH team undertook three tasks this week. The Chief Creative Honcho applied his aesthetic acumen to researching cascading style sheets and identifying attractive, user friendly designs that will best structure RtH’s website and represent its work. RtH’s resident TASER undertook the task of creating a generic maproom that will serve as the foundation of the team’s new site. As a final piece of the group’s preparatory activities, the PMM created a .kmz file with the proposed HSR alignments throughout California, including latitude and longitude points for all stations. The team anticipates final formatting activities and addition of 3 APIs to take place this week.
Next week is our midterm. Captain Planit has developed a plan of action that will ensure each member’s talents are utilized to their fullest. As a team, we have come up with a list of tasks we want done by next weekend and have split the work up fairly.
We plan on having the rough campus map, floor plans for the Luskin building, and the Flickr/Twitter APIs connected. We are individually working on our assigned tasks. For the most part, that involves cleaning up our files and preparing to make them into KMZ files via Google Earth.
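For anyone else prepping KMZ files: a KMZ is just a zip archive whose main entry is a KML file (conventionally named doc.kml), so the packaging step can be scripted. A minimal sketch with a placeholder placemark; the Luskin coordinates below are approximate assumptions, not surveyed values.

```python
import zipfile

def make_kmz(path, name, lat, lon):
    """Write a one-placemark KML and zip it into a KMZ archive."""
    kml = f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Document>
    <Placemark>
      <name>{name}</name>
      <Point><coordinates>{lon},{lat},0</coordinates></Point>
    </Placemark>
  </Document>
</kml>"""
    # Note: KML lists coordinates as lon,lat[,alt] -- easy to flip by accident
    with zipfile.ZipFile(path, "w", zipfile.ZIP_DEFLATED) as z:
        z.writestr("doc.kml", kml)

make_kmz("luskin.kmz", "Luskin Building", 34.0740, -118.4390)
print(zipfile.ZipFile("luskin.kmz").namelist())  # -> ['doc.kml']
```

Google Earth can export KMZ directly, but scripting it this way helps when many cleaned files need to be converted in a batch.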
The APIs are not up yet, but last week’s Flickr tutorial helped a lot. I hope that Twitter does not cause many problems. I think a good source for help with the Twitter API will be the site that Yoh made for the Japan earthquake/tsunami… or Yoh himself.