Android Saga: Pull to Refresh

06/25/2012 § Leave a comment


Pull to refresh is an extremely popular UI gesture (that Loren Brichter pioneered in Tweetie) used in a lot of iOS apps. Basically it makes refreshing part of the scroll gesture itself.

The iOS community has had library support for this for a long time – I think Three20, back in 2010, was the first library to offer this feature out of the box.

The feature arrived in the Android world later on, and today many apps use it.

Since I don’t have much experience with Android, the first thing I did was to search for an open source implementation. I found plenty of them in a couple of seconds.

(I also found a lot of Android developers thanking the Android open source ecosystem for making libraries available, as if that weren’t true for other platforms. iOS, like many other mobile platforms, has a strong open source community too ^^)

The best implementation I found out there was a contribution from Johan Nilsson. Aside from the sluggish scrolling and lack of animations, it works pretty well (correct me if I am wrong, but we can’t blame Johan for half the issues here, since Android makes it really difficult to have smooth scrolling and bounce animations on ListViews).

I didn’t have any problems importing his library and using his implementation, especially because he provides a very straightforward demo. The only thing that really upset me, though, was that the pull-to-refresh header is supposed to be hidden whenever the user isn’t scrolling or refreshing the list, yet the header just stood there showing a “Tap to Refresh” message.

That was when I decided to look at the code and fix it. To be honest, the idea behind Johan’s implementation is very similar to the iOS approach. But not quite.

Let me tell you why.

On iOS, pull to refresh is implemented by the following steps (a minimal sketch follows the list):

1) Create the header view which displays the arrow, the spinner and the text messages
2) Add it to the UIScrollView with a negative y value
3) Implement the UIScrollView’s delegate to set the contentInset to the header height (therefore making it visible) and ask the controller to refresh the content
4) Once the controller is done, it sets the contentInset back to zero (therefore hiding the header view)
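
For concreteness, here is a minimal sketch of steps 3 and 4 (the header setup of steps 1 and 2 is omitted). The HEADER_HEIGHT constant, the _refreshing flag and the refresh/refreshDidFinish methods are hypothetical names of mine, not from any particular library:

- (void)scrollViewDidEndDragging:(UIScrollView *)scrollView willDecelerate:(BOOL)decelerate {
    // Step 3: the user pulled past the header; reveal it and start refreshing.
    if (scrollView.contentOffset.y < -HEADER_HEIGHT && !_refreshing) {
        _refreshing = YES;
        scrollView.contentInset = UIEdgeInsetsMake(HEADER_HEIGHT, 0, 0, 0);
        [self refresh];
    }
}

- (void)refreshDidFinish {
    // Step 4: collapse the inset so the header slides out of sight again.
    _refreshing = NO;
    [UIView animateWithDuration:0.3 animations:^{
        self.scrollView.contentInset = UIEdgeInsetsZero;
    }];
}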

On Android, all the implementations I found follow the steps below:

1) Create the header view which displays the arrow, the spinner and the text messages
2) Extend the ListView and implement onAttachedToWindow, setAdapter, onTouchEvent to select the first row (WTF!?)
3) Implement the onScrollListener (equivalent to the UIScrollView’s delegate)
4) Once the controller is done, select the first row so the list scrolls to the top and hides the header, provided the content is taller than the list view

Although the approaches are very similar, Android’s version is essentially a hack that exploits the optional header view of standard ListViews. When the list is displayed, it scrolls to the first item, effectively hiding the header. When the list is short enough to be displayed entirely on screen, no scrolling is necessary, hence the “Tap to Refresh” button is always visible!

After an hour, I still hadn’t found a way to keep the header hidden, since hiding it would not make it disappear. That is when I came across a StackOverflow post that basically told me to put the header content inside a LinearLayout that wraps its content, and hide the content so the wrapping LinearLayout collapses, resulting in the header view being 0dip high and therefore invisible.
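
For reference, this is roughly what the workaround looks like in the header layout file (the file name and ids are hypothetical): the outer LinearLayout wraps its content, so once the inner view is hidden the wrapper collapses to 0dip and the header disappears.

res/layout/pull_to_refresh_header.xml

<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
 android:layout_width="fill_parent"
 android:layout_height="wrap_content">

 <RelativeLayout
  android:id="@+id/pull_to_refresh_content"
  android:layout_width="fill_parent"
  android:layout_height="wrap_content">
  <!-- the arrow, the spinner and the "Pull to Refresh" label go here -->
 </RelativeLayout>
</LinearLayout>

Hiding the inner view with setVisibility(View.GONE) then collapses the wrapper to zero height.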

I find the solution very bizarre, but it worked, and that is what I am using right now. The only thing that still upsets me is that the feature is not polished enough: it is still sluggish to pull, flickers once in a while, and lacks animation.

I will get back to this post once I figure out a better solution, if any. I am counting on you – Android devs – to help me out 😉

EDIT:

Found a library that provides a very good user experience. Full story here.

Android Saga: Applying an image as the background pattern of a ListView

06/23/2012 § 2 Comments


This is the first (I hope) of many episodes of an epic saga in which an iOS developer (me) ventures into the (so far painful) world of Android development.

Before we get started, let’s get synced. Long before mankind (back in 2007), when Android was at a very early stage, I got all fired up and started one of the first (if not THE first) Android communities in Brazil. It turns out I ended up working with iOS the next year (which won my heart), and since then I have not written a single line of Android code. So, today I am nothing but an iOS developer struggling to build an Android application that actually feels as good as an iOS app.

We all know iOS has appealing features that Android still lacks, like smooth-scrolling apps, list views that bounce nicely, a simulator that is a piece of cake and system upgrades ;P

But I am not going to write bad stuff about Android just to make iOS shine. I am actually following the advice of a friend of mine (idevzilla). After hearing me mumbling for a couple of days, he got tired and told me to go share the experience of coming from the iOS platform and diving into the Android SDK.

Personally, I hope Android experts will come to me and show me better ways of writing Android code so that I can get the user experience I want for my Android app.

Since every awesome app is made of tiny little details, let’s get started with one: applying an image as the background pattern of a ListView (the equivalent of UITableView).

Let’s start with Android.

After an hour of research on how to do this, I found out that Android requires the creation of an XML file describing a bitmap whose source is the image I want to use as the pattern and whose tile mode is repeat. This file must be placed under a drawable folder.

res/drawable/app_background.xml

<?xml version="1.0" encoding="utf-8"?>
<bitmap xmlns:android="http://schemas.android.com/apk/res/android"
 android:src="@drawable/background_pattern"
 android:tileMode="repeat" />

Then we need to create a set of styles for the ListView that use this bitmap and tell the app it should use those styles.

Running the app I saw that nice background. Then I scrolled down the list and it became black. I lifted my finger and then the background was there again (WTF!?).

It happens because, by default, an Android View has a transparent background, and transparency involves a lot of calculation when rendering your application on screen. In order to avoid blending and make rendering faster while scrolling, the ListView widget uses a cache mechanism. This mechanism consists of “turning the view hierarchy opaque”. So I researched a little more until I found out that I needed to set the cache color hint to “transparent”. This solves the issue but also disables the optimization we just discussed.

res/values/styles.xml

<?xml version="1.0" encoding="utf-8"?>
<resources>
<style name="app_theme" parent="android:Theme">
 <item name="android:windowBackground">@drawable/app_background</item>
 <item name="android:listViewStyle">@style/TransparentListView</item>
 <item name="android:expandableListViewStyle">@style/TransparentExpandableListView</item>
</style>

<style name="TransparentListView" parent="@android:style/Widget.ListView">
 <item name="android:cacheColorHint">@android:color/transparent</item>
</style>

<style name="TransparentExpandableListView" parent="@android:style/Widget.ExpandableListView">
 <item name="android:cacheColorHint">@android:color/transparent</item>
</style>
</resources>

AndroidManifest.xml

<application
 android:icon="@drawable/ic_launcher"
 android:label="@string/app_name"
 android:theme="@style/app_theme">

Okay. This works out. BUT… I want the background to scroll together with the ListView so the user feels he is actually moving the list up or down.

Unfortunately, the only way I found to achieve this is to forget about that styles.xml file and undo the changes to the AndroidManifest.xml file. Instead, we need to create a layout file for the list item (the equivalent of UITableViewCell) and add the following to the root layout element (a sketch of the full file follows):

 android:cacheColorHint="@android:color/transparent"
 android:background="@drawable/app_background"
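
To make it concrete, here is a hypothetical res/layout/list_item.xml with those attributes applied to the root element, as described above (the TextView is just a placeholder for your row content):

<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
 android:layout_width="fill_parent"
 android:layout_height="wrap_content"
 android:cacheColorHint="@android:color/transparent"
 android:background="@drawable/app_background">

 <TextView
  android:id="@+id/item_label"
  android:layout_width="fill_parent"
  android:layout_height="wrap_content" />
</LinearLayout>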

This, however, will not disable the highlight caused by the list selectors… but this post has already gotten too long.

Let’s ignore this issue for a moment and see how iOS handles this.

self.tableView.backgroundColor = [UIColor colorWithPatternImage:[UIImage imageNamed:@"background-pattern.png"]];

That is all for today.

Tracing routes with MapKit

05/22/2012 § 23 Comments


Presenting a map to the user is a common feature of mobile apps. And very often this feature comes with an additional requirement: to trace the route from the current user location to some arbitrary destination. The thing is, most apps fulfill this last requirement by adding a button to the right navigation item that opens up Google Maps in the browser. But usually this is not the best user experience.

Most developers don’t know this (and I was one of them not too long ago), but it is possible to use MKMapView to easily render paths between two locations. There isn’t, however (for now), any native API that magically handles this kind of drawing.

iOS handles routes using MKOverlay objects (just like it handles pins using MKAnnotation). There is a native MKOverlay class called MKPolyline which consists of an array of CLLocationCoordinate2D structures that MKMapView knows how to draw.

The thing is: we know only two locations (coordinates), the current one (our origin) and the place’s location (the destination). AND we need all the coordinates in between these two endpoints, describing a smooth path that follows the roads and streets, considers traffic and so on, in order to properly create the MKPolyline object and add it to the map.

This is where the Google Directions API comes in. Google offers an API (both JSON and XML) that, among other options, lets you specify two locations and returns a complex set of information containing all sorts of data, like routes (with alternatives), waypoints, distance and directions (instructions). At first you might look at the documentation and think you need to write a parser, iterate through the structure and grab what you need. That is exactly what you need to do, but it is not as difficult as it seems. The information we are looking for is available as a string named overview_polyline, found under the route tag. Just grab that.
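
Abridged, the JSON response looks something like this (irrelevant fields omitted, the encoded string elided):

{
 "routes": [
  {
   "overview_polyline": {
    "points": "…"
   }
  }
 ],
 "status": "OK"
}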

If you are using JSON (the recommended output), there are a lot of third-party libraries out there that represent a JSON string as native data structures such as NSArray, NSDictionary and NSString. Now, if you are really lazy (and smart), you use a library like AFNetworking to handle the requests and get JSON parsing for free right in the response callback.

Almost every step of the process so far is a piece of cake. MapKit has a native overlay view that knows how to display a route, the route is given to you for free and with almost no effort by Google, and AFNetworking provides automatic parsing of the response Google sent you.

The only remaining detail is: Google Directions API gives us a string representing the route and we need an array of CLLocationCoordinate2D structures.

Fortunately, the Encoded Polyline Algorithm Format used by Google is fully described in the docs, and an Objective-C implementation was made available by Ankit Srivastava on StackOverflow.

For those lazy guys who are in a hurry, good news: There is a code snippet below for every point of our discussion.

(WordPress sucks when it comes to presenting source code, but there is a “View Source” button that lets you copy the code and properly paste it! But just in case you wish to read the code, I have also attached a file here 😉)

  • Create the Map View
_mapView = [[MKMapView alloc] initWithFrame:self.view.bounds];
_mapView.showsUserLocation = YES;
_mapView.delegate = self;
[self.view addSubview:_mapView];
  • Once you have the current location, define the map region you want to be visible:
MKCoordinateRegion viewRegion = MKCoordinateRegionMakeWithDistance(self.location.coordinate, REGION_SIZE, REGION_SIZE);
MKCoordinateRegion adjustedRegion = [_mapView regionThatFits:viewRegion];
[_mapView setRegion:adjustedRegion animated:NO];
  • Also request Google Directions API to retrieve the route:

AFHTTPClient *_httpClient = [AFHTTPClient clientWithBaseURL:[NSURL URLWithString:@"http://maps.googleapis.com/"]];
[_httpClient registerHTTPOperationClass:[AFJSONRequestOperation class]];

NSMutableDictionary *parameters = [[NSMutableDictionary alloc] init];
[parameters setObject:[NSString stringWithFormat:@"%f,%f", location.coordinate.latitude, location.coordinate.longitude] forKey:@"origin"];
[parameters setObject:[NSString stringWithFormat:@"%f,%f", endLocation.coordinate.latitude, endLocation.coordinate.longitude] forKey:@"destination"];
[parameters setObject:@"true" forKey:@"sensor"];

NSMutableURLRequest *request = [_httpClient requestWithMethod:@"GET" path:@"maps/api/directions/json" parameters:parameters];
request.cachePolicy = NSURLRequestReloadIgnoringLocalCacheData;

AFHTTPRequestOperation *operation = [_httpClient HTTPRequestOperationWithRequest:request success:^(AFHTTPRequestOperation *operation, id response) {
    NSInteger statusCode = operation.response.statusCode;
    if (statusCode == 200) {
        [self parseResponse:response];
    } else {
        // The request succeeded but Google returned an unexpected status code.
    }
} failure:^(AFHTTPRequestOperation *operation, NSError *error) {
    // The request itself failed; surface the error to the user here.
}];

[_httpClient enqueueHTTPRequestOperation:operation];

  • Get what you need:
- (void)parseResponse:(NSDictionary *)response {
    NSArray *routes = [response objectForKey:@"routes"];
    NSDictionary *route = [routes lastObject];
    if (route) {
        NSString *overviewPolyline = [[route objectForKey:@"overview_polyline"] objectForKey:@"points"];
        _path = [self decodePolyLine:overviewPolyline];
    }
}
  • And use the code provided by Ankit Srivastava:
- (NSMutableArray *)decodePolyLine:(NSString *)encodedStr {
    NSMutableString *encoded = [[NSMutableString alloc] initWithCapacity:[encodedStr length]];
    [encoded appendString:encodedStr];
    [encoded replaceOccurrencesOfString:@"\\\\" withString:@"\\"
                                options:NSLiteralSearch
                                  range:NSMakeRange(0, [encoded length])];
    NSInteger len = [encoded length];
    NSInteger index = 0;
    NSMutableArray *array = [[NSMutableArray alloc] init];
    NSInteger lat = 0;
    NSInteger lng = 0;
    while (index < len) {
        NSInteger b;
        NSInteger shift = 0;
        NSInteger result = 0;
        do {
            b = [encoded characterAtIndex:index++] - 63;
            result |= (b & 0x1f) << shift;
            shift += 5;
        } while (b >= 0x20);
        NSInteger dlat = ((result & 1) ? ~(result >> 1) : (result >> 1));
        lat += dlat;
        shift = 0;
        result = 0;
        do {
            b = [encoded characterAtIndex:index++] - 63;
            result |= (b & 0x1f) << shift;
            shift += 5;
        } while (b >= 0x20);
        NSInteger dlng = ((result & 1) ? ~(result >> 1) : (result >> 1));
        lng += dlng;
        NSNumber *latitude = [[NSNumber alloc] initWithFloat:lat * 1e-5];
        NSNumber *longitude = [[NSNumber alloc] initWithFloat:lng * 1e-5];

        CLLocation *location = [[CLLocation alloc] initWithLatitude:[latitude floatValue] longitude:[longitude floatValue]];
        [array addObject:location];
    }

    return array;
}
  • Create the MKPolyline annotation:
NSInteger numberOfSteps = _path.count;

CLLocationCoordinate2D coordinates[numberOfSteps];
for (NSInteger index = 0; index < numberOfSteps; index++) {
 CLLocation *location = [_path objectAtIndex:index];
 CLLocationCoordinate2D coordinate = location.coordinate;

 coordinates[index] = coordinate;
}

MKPolyline *polyLine = [MKPolyline polylineWithCoordinates:coordinates count:numberOfSteps];
[_mapView addOverlay:polyLine];
  • And make it visible on the map view:
- (MKOverlayView *)mapView:(MKMapView *)mapView viewForOverlay:(id <MKOverlay>)overlay {
 MKPolylineView *polylineView = [[MKPolylineView alloc] initWithPolyline:overlay];
 polylineView.strokeColor = [UIColor redColor];
 polylineView.lineWidth = 1.0;

 return polylineView;
}

Please note the code snippets provided in this post don’t have any error handling, nor are they optimized. Remember to fix these issues before copying them into your application.

Natural Code Just Works.

01/13/2012 § 2 Comments


As soon as I heard about ARC I thought “WOW, this is amazing! A compile-time garbage collector! Why didn’t anyone think of this before!?”.

But then, even after migrating to iOS 5, I was a little scared about changing the compiler and the whole memory management schema I had been using my entire life. Actually, I was waiting to be able to work on a new project so that I could start using these new concepts. It turns out I couldn’t wait anymore and decided to migrate this huge project I am working on to ARC. Not only for the promise of “running faster”, but also for education (in the end, I love to explore and learn new ways of writing code).

After migrating the source code using the automatic migration tool provided by Xcode 4.2.1 (and setting the Simulator as deployment target hahahah), I was immediately able to see that natural code just works. That is the way Apple wants us to think about using ARC. And my impression tells me this is totally possible.

But I am a man full of questions about the meaning of life and all that crap, so I couldn’t just take it on faith; I started to watch some ARC talks Apple has given in order to comprehend how this magic works behind the scenes. Truth is, I can’t live with something I don’t understand when it comes to coding.

Although ARC is pretty simple, here are some annotations I have made that really helped me to understand “da magic”.

First of all, there are 5 things you cannot forget:

1) Strong references. Every variable is a strong reference and is implicitly released after its scope ends. A strong reference is the same thing as a retained reference that you don’t manage. For example:

When you declare NSString *name; the compiler understands you actually meant __strong NSString *name;. And this means you don’t need to retain the reference nor release it afterwards anymore.

- (id)init {
	self = [super init];
	if (self) {
		name = [@"Name" retain];
	}
	return self;
}

- (void)dealloc {
	[name release];
	[super dealloc];
}

becomes

- (id)init {
	self = [super init];
	if (self) {
		name = @"Name";
	}
	return self;
}

2) Autoreleasing References. Every out-parameter is already retained and autoreleased for you.

- (void)method: (NSObject **)param { *param = …; } 

means

- (void)method: (__autoreleasing NSObject **)param {
	*param = [[… retain] autorelease];
}

3) Unsafe references. If you see this, keep in mind you are working with a variable that is non-initialized, carries no extra compiler logic and has no restrictions. An unsafe reference tells ARC not to touch it, and as a result what you get is the same as an assign property. The advantage here is that you can use it inside structs. But be warned: this can easily lead to dangling references.

 __unsafe_unretained NSString *unsafeName = name; 

4) Weak References. Works like an assign property, but becomes nil as soon as the object starts deallocation.

 __weak NSString *weakName = name; 

If you want to make a reference weak, just add __weak before the variable declaration, or use weak on the property instead of the old assign attribute.
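
A quick sketch of the zeroing behavior (the variable names are mine):

__strong NSObject *object = [[NSObject alloc] init];
__weak NSObject *weakObject = object; // does not retain the object
object = nil;                         // the last strong reference goes away
NSLog(@"%@", weakObject);             // prints (null): the weak reference was zeroed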

5) Return values. They never transfer ownership (ARC retains and returns an autoreleased object for you) unless the selector starts with alloc, copy, init, mutableCopy or new. In these cases ARC returns a +1 reference (for you), which you also don’t need to bother with on the caller side due to the rules we discussed above.
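
To illustrate (someString is a hypothetical NSString of mine):

NSMutableString *a = [NSMutableString string];  // +0: ARC balances the reference for you
NSMutableString *b = [someString mutableCopy];  // +1: the "copy" family transfers ownership
// Under ARC both are just strong locals; you don't release either one.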

Now that you know how ARC works and what it does, you can write natural code in peace =)

Multiple Video Playback on iOS

01/11/2012 § 44 Comments


As usual let me start by telling you a quick story on how this post came to be…

Once upon a time… NOT. So, I was working on this project for Hyundai at nKey when we got to this screen that required two videos playing at the same time, so the user could see how a car behaves with and without a feature (like Electronic Stability Control). As an experienced developer, I immediately told the customer we should merge both videos so that we could play “both at the same time” on iOS. I explained to him that, in order to play videos on iOS, Apple released the MediaPlayer.framework a long time ago, which according to the docs (and real life :P) isn’t able to handle more than one video playback at a time (although you can have two MPMoviePlayerController instances).

He said OK and that was what we did. However, the requirements changed and we had to add a background video that plays in a loop… aaand I started to have problems coordinating all these video playbacks so that only one was playing at a time and the user wouldn’t notice it.

Fortunately, nKey sent me to the iOS Tech Talk that happened in São Paulo, Brazil, that very Monday, and there I attended a talk where the Media Technologies Evangelist, Eryk Vershen, discussed the AVFoundation.framework and how it is used by the MediaPlayer.framework (aka MPMoviePlayerController) for video playback. After the talk, during Wine and Cheese, I got to speak to Eryk about my issue and explained how I was thinking about handling the problem. His answer was something like “Sure! Go for it! iOS surely is capable of playing multiple videos at the same time!… Oh… and I think something around 4 is the limit”. That answer made me happy and curious, so I asked him why the MediaPlayer.framework is incapable of handling multiple video playback if that isn’t a library limitation… He told me the MPMoviePlayerController was created early on to present cut-scenes in games, that this is why on previous iOS versions only fullscreen playback was allowed, and that this limitation is a matter of legacy.

When I got back to my notebook, I worked on this very basic version of a video player using the AVFoundation.framework (which, obviously, I made more robust when I got back to the office so that we could use it on the project).

Ok, story told. Let’s get back to work!

The AVFoundation framework provides the AVPlayer object to implement controllers and user interfaces for single- or multiple-item playback. The visual results generated by an AVPlayer object can be displayed in a Core Animation layer of class AVPlayerLayer. In AVFoundation, timed audiovisual media such as videos and sounds are represented by an AVAsset object. According to the docs, each asset contains a collection of tracks that are intended to be presented or processed together, each of a uniform media type, including but not limited to audio, video, text, closed captions, and subtitles. Due to the nature of timed audiovisual media, upon successful initialization of an asset, some or all of the values for its keys may not be immediately available. In order to avoid blocking the main thread, you can register your interest in particular keys and be notified when their values become available.

With this in mind, subclass UIViewController and name the class VideoPlayerViewController. Just like the MPMoviePlayerController, let’s add an NSURL property that tells us where we should grab our video from. As described above, add the following code to load the AVAsset once the URL is set.
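
(Quick aside: the snippets below reference a few key names and KVO contexts that are never declared in the post. A minimal sketch of how they might look, mirroring the names used in the code; the context values don’t matter, only that the pointers are unique:)

static NSString * const kTracksKey = @"tracks";
static NSString * const kPlayableKey = @"playable";
static NSString * const kStatusKey = @"status";
static NSString * const kCurrentItemKey = @"currentItem";

static void *AVPlayerDemoPlaybackViewControllerStatusObservationContext = &AVPlayerDemoPlaybackViewControllerStatusObservationContext;
static void *AVPlayerDemoPlaybackViewControllerCurrentItemObservationContext = &AVPlayerDemoPlaybackViewControllerCurrentItemObservationContext;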

#pragma mark - Public methods

- (void)setURL:(NSURL *)URL {
    [_URL release];
    _URL = [URL copy];

    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:_URL options:nil];
    NSArray *requestedKeys = [NSArray arrayWithObjects:kTracksKey, kPlayableKey, nil];
    [asset loadValuesAsynchronouslyForKeys:requestedKeys completionHandler:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            [self prepareToPlayAsset:asset withKeys:requestedKeys];
        });
    }];
}

- (NSURL *)URL {
    return _URL;
}

So, once the URL for the video is set, we create an asset to inspect the resource referenced by the given URL and asynchronously load the values for the asset keys “tracks” and “playable”. On load completion, we operate on the AVPlayer on the main queue (the main queue is used to naturally ensure safe access to a player’s nonatomic properties while dynamic changes in playback state may be reported).

#pragma mark - Private methods

- (void)prepareToPlayAsset:(AVURLAsset *)asset withKeys:(NSArray *)requestedKeys {
    // Check whether the values of each of the keys we need were loaded successfully.
    for (NSString *thisKey in requestedKeys) {
        NSError *error = nil;
        AVKeyValueStatus keyStatus = [asset statusOfValueForKey:thisKey error:&error];
        if (keyStatus == AVKeyValueStatusFailed) {
            return;
        }
    }

    if (!asset.playable) {
        return;
    }

    // Stop observing the previous player item, if any.
    if (self.playerItem) {
        [self.playerItem removeObserver:self forKeyPath:kStatusKey];
        [[NSNotificationCenter defaultCenter] removeObserver:self
                                                        name:AVPlayerItemDidPlayToEndTimeNotification
                                                      object:self.playerItem];
    }

    self.playerItem = [AVPlayerItem playerItemWithAsset:asset];
    [self.playerItem addObserver:self
                      forKeyPath:kStatusKey
                         options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew
                         context:AVPlayerDemoPlaybackViewControllerStatusObservationContext];

    if (![self player]) {
        [self setPlayer:[AVPlayer playerWithPlayerItem:self.playerItem]];
        [self.player addObserver:self
                      forKeyPath:kCurrentItemKey
                         options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew
                         context:AVPlayerDemoPlaybackViewControllerCurrentItemObservationContext];
    }

    if (self.player.currentItem != self.playerItem) {
        [[self player] replaceCurrentItemWithPlayerItem:self.playerItem];
    }
}

At the completion of the loading of the values for all the keys we require on the asset, we check whether the loading was successful and whether the asset is playable. If so, we set up an AVPlayerItem (a representation of the presentation state of an asset that’s played by an AVPlayer object) and an AVPlayer to play the asset. Note that I didn’t add any error handling at this point. Here we should probably create a delegate and let the view controller, or whoever is using your player, decide the best way to handle the possible errors.

We also added some key-value observers so that we are notified when our view should be tied to the player and when the AVPlayerItem is ready to play.

#pragma mark - Key-Value Observing

- (void)observeValueForKeyPath:(NSString *)path
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context {
    if (context == AVPlayerDemoPlaybackViewControllerStatusObservationContext) {
        AVPlayerStatus status = [[change objectForKey:NSKeyValueChangeNewKey] integerValue];
        if (status == AVPlayerStatusReadyToPlay) {
            [self.player play];
        }
    } else if (context == AVPlayerDemoPlaybackViewControllerCurrentItemObservationContext) {
        AVPlayerItem *newPlayerItem = [change objectForKey:NSKeyValueChangeNewKey];
        if (newPlayerItem) {
            [self.playerView setPlayer:self.player];
            [self.playerView setVideoFillMode:AVLayerVideoGravityResizeAspect];
        }
    } else {
        [super observeValueForKeyPath:path ofObject:object change:change context:context];
    }
}

Once the AVPlayerItem is in place, we are free to attach the AVPlayer to the player layer that displays the visual output. We also make sure to preserve the video’s aspect ratio and fit the video within the layer’s bounds.

As soon as the AVPlayer is ready, we command it to play! iOS does the hard work 🙂

As I mentioned earlier, to play the visual component of an asset, you need a view containing an AVPlayerLayer to which the output of an AVPlayer object can be directed. This is how you subclass UIView to meet that requirement:

@implementation VideoPlayerView

+ (Class)layerClass {
	return [AVPlayerLayer class];
}

- (AVPlayer*)player {
	return [(AVPlayerLayer*)[self layer] player];
}

- (void)setPlayer: (AVPlayer*)player {
	[(AVPlayerLayer*)[self layer] setPlayer:player];
}

- (void)setVideoFillMode: (NSString *)fillMode {
	AVPlayerLayer *playerLayer = (AVPlayerLayer*)[self layer];
	playerLayer.videoGravity = fillMode;
}

@end

And this is it!

Of course I didn’t add all the necessary code for building and running the project, but I wouldn’t let you down! Go to GitHub and download the full source code!

3D Tag Cloud available on GitHub!

11/26/2011 § Leave a comment


Hey guys!

This time I bring very good news! The 3D Tag Cloud I created almost a year ago is finally free software, available on GitHub.

Yes, that is right. A lot of people have been asking for a code sample after reading that tutorial, so I decided to make it available on GitHub as free software under the terms of the GNU General Public License version 3, so that you guys can use, redistribute or modify it at will.

Now it is your turn! Contribute!

SECCOM 2011: Introduction to iPhone Development

10/17/2011 § Leave a comment


Hey!

So, this year is the second time I am talking about cool stuff at SECCOM. Only this time it is about even cooler stuff: the iPhone!

SECCOM is a great event that gets more amazing every year. It is a week when a bunch of people come to UFSC (Universidade Federal de Santa Catarina) to share ideas about technology. This year it features discussions on software quality, embedded software, databases, Arduino, Android, Windows Phone 7, the iPhone and much more.

My talk is about iPhone development; actually, it is more like an introduction. The idea is to tell people how they can get started in the iOS world. I am covering very basic topics like what the cost is, what the tools are and what the iPhone is all about. Of course, it wouldn’t be a good talk for beginners if it didn’t have a cool demo, so the demo I chose shows a very basic use of UITableView and the ZBar library for scanning QR codes.

If you watched the presentation and want the material, or couldn’t go but would like it as well, I have good news: it is all available here. Oh, and one more thing: since the talk happened in Brazil, the presentation is in Portuguese, but I took care to translate it to English so that you guys can also enjoy it!

Thank you!