Recently people have released tools that make it easy to find something to map on the OpenStreetMap (OSM) project. If you’re looking for something to do, take a look at one of these tools:
The Humanitarian OpenStreetMap Team (HOT) focuses on areas of the world that aren’t mapped well (or at all) and where there is a need for a humanitarian effort. If you’ve been around OSM for a while you’ll probably remember reading about the mapping effort in Haiti after the earthquake back in 2010. The humanitarian effort didn’t stop there. Most recently, Typhoon Haiyan has left many international emergency responders in need of mapping data to move their resources around and perform damage assessments. HOT’s projects are listed on their website and allow you to check out a task for mapping, or to validate other mappers’ work. As work is completed it goes into the hands of the responders on the ground.
Okay, this is a neat idea. Whenever edits or massive data imports are made, there are times when everything doesn’t mesh well. Maybe a road crosses another road in a way that looks fine on screen but breaks navigation and leads to inefficient routes. These bugs can go unnoticed unless someone tries to use the data and hits the problem, or happens to stumble upon it. That’s where MapRoulette comes in. The system is programmed with a specific class of problem that needs to be fixed; right now that problem is the connectivity issue I mentioned. It combs through the OSM database looking for just these issues and records where they are located. A visitor to the site is shown the next error in the list and offered an editor so they can either fix the problem or flag it as a false positive. Because the edits are quick, I’ve been able to knock out forty or more in an hour without much thought. (Hey, some people knit and watch TV; I fix mapping bugs and watch TV!)
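To give a feel for what a checker like this looks for (my own illustration, not MapRoulette’s actual code): two ways that geometrically cross but share no node are an invisible junction to a router. A minimal sketch, with ways represented as lists of coordinate pairs:

```python
def segments_intersect(p1, p2, q1, q2):
    """Return True if segment p1-p2 properly crosses segment q1-q2."""
    def cross(o, a, b):
        # z-component of the cross product (a - o) x (b - o)
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    d1 = cross(q1, q2, p1)
    d2 = cross(q1, q2, p2)
    d3 = cross(p1, p2, q1)
    d4 = cross(p1, p2, q2)
    # The segments cross if each straddles the line through the other.
    return ((d1 > 0) != (d2 > 0)) and ((d3 > 0) != (d4 > 0))

def ways_cross_without_node(way_a, way_b):
    """Flag a potential connectivity bug: two ways cross on the map
    but share no node, so routers treat them as unconnected."""
    if set(way_a) & set(way_b):
        return False  # shared node: a real junction, nothing to fix
    for a1, a2 in zip(way_a, way_a[1:]):
        for b1, b2 in zip(way_b, way_b[1:]):
            if segments_intersect(a1, a2, b1, b2):
                return True
    return False
```

This only detects proper crossings; in practice the real checks are fuzzier (tunnels, bridges, and layer tags are all legitimate non-junctions), which is exactly why a human gets the final say on each flagged case.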
Oh yes, the Battle Grid! Sounds like a cage match, doesn’t it? When the US Government released its TIGER 2012 mapping data, it turned out to be of much better quality than the original TIGER data that many of the United States’ roadways in OSM were based upon. Battle Grid (from MapRoulette) compares the TIGER 2012 data to what’s in OSM and shows the discrepancies as colored blocks: the redder the box, the more discrepancies in that grid cell. A quick look at the map shows you where your mapping skills are most needed. Additional tools let you overlay the TIGER 2012 data beneath the OSM data for comparison, so you can see the problem areas and fix them quickly.
There are other ways to find data that might be missing. Just walking around your community and comparing the OSM data to what you see with your own eyes certainly helps. Validating data against satellite imagery is also good. As I wrote in an earlier post, collecting points of interest and address information while out and about is also quite useful. Getting involved in OSM is easy and learning how to map isn’t difficult either. So get out there and get mapping!
Okay, I admit it, I’m a closet cartographer. Few things excite me like looking at, building, and working with maps. Luckily for me the OpenStreetMap (OSM) project was born, and I started contributing back in 2008. When I started out I was making minor changes to the TIGER map data, cleaning up the bad data that peppered my local town. Today I’m still cleaning up data, but I’m also adding points of interest (POIs), such as restaurants, shops, and hotels, along with address information that makes the overall data more useful to consumers.
The tools used to edit and collect mapping data have improved over the past few months with many applications coming to the Android operating system. The physical size of many Android devices allows field collection of data without having to lug around a laptop. With many of the features now available on the portable platform, collecting mapping data is easier than ever.
Tools of the Trade
The Java OpenStreetMap Editor (JOSM) is still the workhorse of my contributions; I use it for most of my edits on the project. Whether I’m working from GPX files of trails and roads collected in the field or adding POIs and other map features from satellite imagery, JOSM makes advanced additions and changes easy. Many mapping programs use JOSM as a springboard to get their data into the OpenStreetMap repositories. If you are serious about working with OSM data, you should get comfortable with JOSM.
OsmAnd Maps and Navigation, an Android application, is usually marketed as a program for viewing OSM data and navigating from one point to another. It lets you download mapping data directly to your device, which is quite helpful when you don’t have an Internet connection; many other mapping solutions need a connection to fetch their data on the fly.
From a contributor’s point of view, OsmAnd lets you record GPX track files that can later be edited in JOSM, and also lets you create, and upload directly to OSM, POIs such as restaurants that you may be visiting at the time. This is a great feature for me, as I sometimes find myself somewhere that isn’t officially on the map yet.
Keypad-Mapper 3 is an Android application that allows easy mapping of house numbers. Using JOSM or Potlatch 2, the online OSM editor, Keypad-Mapper data can be imported, verified, and then uploaded into the OSM repositories.
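For reference, the house numbers these tools collect end up in OSM as `addr:*` tags on nodes. A minimal sketch of what one collected address looks like in OSM’s XML format (the id and coordinates here are placeholders; a negative id is the usual editor convention for a not-yet-uploaded object):

```xml
<node id="-1" lat="40.7128" lon="-74.0060">
  <tag k="addr:housenumber" v="221"/>
  <tag k="addr:street" v="Baker Street"/>
</node>
```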
Other software is available to collect, modify, and use OSM mapping data. If maps interest you or if you are just looking for a good, open source mapping solution take a look at OSM and enjoy the large amount of global work that goes into the project every day.
I’ve just built the latest version of CQRLOG, version 1.6.1, for Fedora 18 through 21. The packages are being pushed to the updates-testing repos now and should be available soon. If you use CQRLOG in Fedora from the repositories I’d appreciate you testing this latest build and giving karma if it works (or doesn’t work) for you.
This update provides the following enhancements and bugfixes:
- 630M band added
- added OQRS (online QSL request system) to QSL sent menu
- added “Always sort by QSO date” option to Search function
- cursor is moved to last opened log in DB connection window
- “Ask before creating a backup” option to “Auto backup” added
- band map is much faster, a few optimizations added
- program froze for a few milliseconds with every bandmap refresh – fixed
- “MySQL server has gone away” problem fixed
- membership value collation was case sensitive – fixed
- ADIF import sometimes crashed with an access violation, now shows what happened
- qrz search with right click on a call in the recent QSOs list didn’t work – fixed
- band map font settings were not loaded when program started – fixed
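If you’d like to help test, something along these lines should pull the build before it hits stable (a sketch assuming the package is named `cqrlog`, as it is in the Fedora repos; Fedora 18–21 use yum):

```shell
# Install or update CQRLOG from the updates-testing repo
yum --enablerepo=updates-testing update cqrlog
```

Karma can then be left on the update’s Bodhi page, or with the `fedora-easy-karma` tool if you have it installed.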
I’ve been arguing with my web hosting company about their use of RC4. Like many enterprise networks, they aren’t consistent across all their servers with respect to the available ciphers. It appears that all customer servers support TLS_RSA_WITH_CAMELLIA_256_CBC_SHA and TLS_RSA_WITH_CAMELLIA_128_CBC_SHA in addition to TLS_RSA_WITH_RC4_128_SHA (although the RC4 suite is preferred over the other two), but their backend control web servers only support RC4. This is a problem: if you manage your keys and other security settings over a weakly encrypted path, you have essentially failed to secure your web service before you even started.
So what’s wrong with RC4?
It’s been known for a while (years!) that RC4 is not a good encryption cipher. It’s broken, and several practical attacks against it are available. So why is it used so frequently? In a word: BEAST. RC4 was the only widely supported stream cipher that could combat BEAST, so it became the standard for TLS connections. It’s not clear which attack vector is worse: BEAST or the weaknesses in RC4.
In recent months most Internet browsers have implemented the 1/n-1 record-splitting workaround to address the BEAST vulnerability. With the fix in place it should, once again, be safe to use block ciphers and thus get better encryption (better protection). Many people and organizations have been talking about the need to get rid of RC4 now, since it has become the bigger threat to web security. Yesterday Microsoft released a security bulletin discussing the problem and urged all developers to stop using RC4. (Oh yeah, and they want everyone to stop using SHA-1 as well.) I usually think of Microsoft as trailing in the security field (let’s face it, their products haven’t been known for being secure ever since that whole network thing happened), so when even they say this mess with RC4 must stop, it’s gotten to the point where we should have already done so.
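If you run your own web server, dropping RC4 is a one-line change. A hedged sketch for Apache httpd with mod_ssl (the cipher string uses standard OpenSSL list syntax; tune it for your own client base before deploying):

```apache
# Prefer the server's cipher ordering, and exclude RC4 along with
# anonymous and MD5-based suites
SSLHonorCipherOrder on
SSLCipherSuite HIGH:!RC4:!aNULL:!MD5
```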
So what are we waiting for?
I think, simply, we’re waiting for TLSv1.1 and TLSv1.2 to become mainstream. It’s not as if these technologies just popped up on our radar screens (they’ve been out since April 2006 and August 2008, respectively), but adoption of the two newer flavors of TLS has been slow. According to Microsoft, their products are ready for TLSv1.1 and TLSv1.2 (both IIS and IE 11). Firefox supports up to TLSv1.2 as of version 25.0, but you have to turn it on manually (it’s there for testing), and OpenSSL (used by Apache) supports TLSv1.2 as of its 1.0.1 releases. It’s time to start pushing these better encryption mechanisms into operation… now.
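On the server side there isn’t much to wait for. A sketch for Apache httpd with mod_ssl, assuming it’s built against OpenSSL 1.0.1 or later (which is what brings in the TLSv1.1/1.2 support): let the server negotiate everything modern and shut off the legacy SSL protocols.

```apache
# "all" includes TLSv1.1 and TLSv1.2 when OpenSSL supports them;
# SSLv2 and SSLv3 are explicitly disabled
SSLProtocol all -SSLv2 -SSLv3
```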
Thought I’d pass along this research study, The keys to the kingdom, as I found it quite interesting (especially given that the data comes from scanning the entire Internet). If you don’t understand the math explanation at the beginning, just keep reading; you don’t need a degree in math or science to understand what’s going on.