Monday, December 6, 2010

One day to Dreamforce; my chatter entry gets some airtime!

It's one day to Dreamforce and I made it to the US this morning after 12 hours in the air from Brisbane. Some fellow DF'ers were accompanying me on the flight, which was cool - even from Brisbane, Australia there's Dreamforce excitement!

Grabbed a quick look at Ben Kepes's Dreamforce preview blog entry on Friday evening and was pleasantly surprised to see my Chatter dev challenge app TwitCh mentioned, along with my video write-up of what it does - thanks Ben for the mention! I am aiming to get even more features into a V2, which I'm going to do as part of the Hackathon tomorrow at Cloudstock. 12+ hours on the plane over gave me a bit of time to do a list of user stories that I think will make TwitCh even more usable, and if anyone is keen I will aim to put it on Codeshare afterwards.

Apart from that, can't wait to see what Cloudstock has in store for us - so many tech companies to get round in one day! Looking forward to a jet-lag-busting rest this afternoon, in preparation for the rest of the week. See you all from tomorrow...

Friday, December 3, 2010

Force.com Workbench v20 is out! Get it working on Mac OS X

I am a big fan of Workbench, an awesome PHP app that's open source on google code, and I'm even happier now that v20 has been released! I saw on the code site that there have only been 238 downloads – if you haven't used this before, you need to take a look! Since starting to use it I now make more use of Workbench than I do of Data Loader – not that Data Loader doesn't have its place, but Workbench is a lot more flexible in how it accesses and displays data to humans.

For the non- or only vaguely-techies among you, a PHP app requires a web server to run on, so Workbench is a little more involved than just downloading install.exe, but it's well worth the adventure. I have a Mac, and I thought I'd post a step-by-step guide to getting this working – believe me, it's worth your while (and Friday afternoon) to get this up and running. If you're on a PC then google for 'force.com workbench PC installation' and I'm sure you'll find similar posts.

1. Get yourself a web server. Apache is perfect; even better, it comes nicely set up and bundled as part of XAMPP. Download the XAMPP dmg from SourceForge and install it into /Applications.

The good part about installing this packaged version of Apache is that it already takes care of a number of extensions that you would otherwise need to worry about – like the SOAP and SSL extensions. Happy days!

2. XAMPP should have installed itself under /Applications/XAMPP. Go and find this directory, and within it there should be a link to htdocs. This is your web server's root directory.

3. Download v20 of Workbench from the google code site here. I've downloaded the stable v20.0.0 but you could try the beta version also.

4. Unpack the contents to a folder called 'workbench' or similar under your htdocs directory.

5. Run the XAMPP Control utility (in the base XAMPP directory) and then click on the start button for Apache. Note you'll need to be an admin user to do this; you'll likely get prompted if you're not.

6. XAMPP starts up the Apache web server on port 80 by default, so in a browser navigate to http://localhost/workbench. You should see the default login page for Workbench.

If you have a direct connection to the internet then you're good to go; you should be able to enter your username and password from here and sign in! Use the advanced mode if you want to sign into a sandbox org, or change the default login URL.

Also note that you may need to append your security token to the end of the password if you're outside your org's trusted IP range.

If you're using a proxy server then there are a few additional steps to perform, so read on...

Proxy setup

If you're on a Mac that's behind a firewall then you'll need to configure proxy settings.

1. First you need to make sure that the proxy options are enabled in the config. Go to htdocs/workbench and open config.php.

2. Search for the string 'header_proxyOptions' and you should come across an array of settings. Notice that in this array, the value for the key of 'display' is 'false'. Change this to 'true'.

3. You also need to ensure each of the proxy options is overrideable. Immediately below the 'header_proxyOptions' array are 5 more arrays: proxyEnabled, proxyHost, proxyPort, proxyUsername and proxyPassword. For each of these arrays, change the value for the key of 'overrideable' from 'false' to 'true' (there's a rough sketch of these changes after this list).

4. Save your config.php and refresh the login page. Then hover your mouse over the Workbench image and you'll see a drop-down menu. Navigate to Settings.

5. Under these settings, you should have the proxy options. Check the 'proxy enabled' checkbox and add details of the host and port, and username/password if applicable.

6. Once you Apply these settings, your connection should be good to go!
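
For reference, here's roughly what the changes from steps 2 and 3 look like in config.php. This is a simplified sketch only - the surrounding structure and the other keys in each array may differ a little between Workbench versions:

"header_proxyOptions" => array(
    "display" => true          // was false - exposes the proxy section under Settings
),
"proxyEnabled" => array(
    "overrideable" => true     // was false
),
// ...and make the same 'overrideable' => true change in the proxyHost,
// proxyPort, proxyUsername and proxyPassword arrays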

General settings

Even if you don't need to enable proxy settings, you should still take a look at the Settings menu by hovering over the Workbench image on the login page and navigating to Settings. Here you'll find the usual options for default batch sizes, assignment rules, and other settings similar to those in Data Loader.

Other PHP settings

XAMPP comes with a reasonable set of PHP initialisation parameters, for memory size etc., so for the most part your Workbench should perform well even with large files for your DML operations. If you find you are hitting limits though, try increasing some of the following parameters, found in your PHP.INI file under /etc in your XAMPP base directory.

max_execution_time = 30        ; Maximum execution time of each script, in seconds
max_input_time = 60            ; Maximum amount of time each script may spend parsing request data
memory_limit = 128M            ; Maximum amount of memory a script may consume
upload_max_filesize = 128M     ; Maximum allowed size of an uploaded file
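
If you're not sure which php.ini XAMPP is actually loading, or whether a change has taken effect, a throwaway one-liner dropped into htdocs will tell you (the file name here is just a suggestion):

<?php
// Save as info.php under htdocs, then browse to http://localhost/info.php.
// Check the 'Loaded Configuration File' row, the parameter values above, and
// that the curl, soap and openssl sections are present.
phpinfo();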

Using the Bulk API

If you are keen to use the Bulk API for asynchronous processing of really large datasets, then you'll need to enable cURL. Because the Bulk API uses REST rather than SOAP as its application protocol, you need to ensure the cURL extension is enabled in PHP.

To do this, go to your PHP.INI file and search for 'curl'. You should find a commented line in the extensions area of the file:

;extension=curl.so
(NB. Don't uncomment the line extension=php_curl.dll as that's for Windows only)

Remove the ; to uncomment the line, save your PHP.INI file, then restart Apache. That should be it! Note again that if you're using a proxy server, you'll need to set the proxy options for cURL as well - this means adjusting the options set after the curl_init() call in the PHP script. Check out my post on the developer forums here for more details.
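
For what that looks like in practice, here's a rough illustration only (this isn't Workbench's actual code, and the proxy host/port values are made up):

$ch = curl_init($url);                                          // $url = the Bulk API endpoint being called
curl_setopt($ch, CURLOPT_PROXY, 'proxy.mycompany.com');         // hypothetical proxy host
curl_setopt($ch, CURLOPT_PROXYPORT, 8080);                      // hypothetical proxy port
curl_setopt($ch, CURLOPT_PROXYUSERPWD, 'username:password');    // only if your proxy requires authentication
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);                 // return the response as a string
$response = curl_exec($ch);
curl_close($ch);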

Have fun with Workbench!

Tuesday, November 9, 2010

Number of business days between two dates

Well, it's been a few months since I've posted, mainly down to a change of role - I cut down on my remote working (day after day in the 2nd bedroom was getting a bit tedious) and ended up working in Sydney for a finance company who are doing a lot of cool force.com stuff. It's been great getting back into full-scale dev work, and I've had a chance to work with some technologies that I hadn't really used in any great depth - one being Hudson, which is ultra-cool when combined with the force.com migration tool. This has inspired me to get back into blogging and given me some nice starter topics... more on those in the coming weeks.

Thought I'd start off with a pretty simple post though, to make sure I remember how ;)

A lot of the stuff I've been doing is around transaction clearing, where SLAs are based on the number of business days to settle a transaction. Cases obviously have the concept of 'age', but that's the only place I could find in salesforce.com where this is handled automatically, based on the current date vs the date opened. I needed to calculate this on a custom object, and not always against the current date (ie. sometimes between 2 fixed dates).

So I came up with the following formula fields to determine the number of business days. They're pretty straightforward, and I'm mostly writing this to document them for my own future use, but they might come in handy for you too.

First I need to ensure the two dates I'm comparing do in fact fall on a business day (Mon-Fri). A simple MOD call against a known Sunday gives the day of the week, so weekend dates can be shifted onto a working day.

Field: Process_Date_Working_Date__c
Formula:
CASE( MOD( Process_Date__c - DATE(2009,1,4), 7 ),
  0, Process_Date__c + 1,
  1, Process_Date__c,
  2, Process_Date__c,
  3, Process_Date__c,
  4, Process_Date__c,
  5, Process_Date__c,
  6, Process_Date__c + 2,
  Process_Date__c
)

Field: Current_Working_Date__c
Formula:
CASE( MOD( TODAY() - DATE(2010,7,18), 7 ),
  0, TODAY() - 2,
  1, TODAY(),
  2, TODAY(),
  3, TODAY(),
  4, TODAY(),
  5, TODAY(),
  6, TODAY() - 1,
  TODAY()
)

You can see I can play around with how a Sat or Sun is handled - either pull it back into the previous working week or push it forward to the start of the next one. (Both anchor dates, DATE(2009,1,4) and DATE(2010,7,18), are Sundays, so the MOD result is 0 for a Sunday and 6 for a Saturday.) The current working date I pull back to the previous Friday; the process date (the earlier of the 2 dates) I push forward to the Monday following the Sat/Sun.
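
To make the MOD trick concrete: say Process_Date__c falls on Saturday 6 November 2010. Saturday is 6 days on from the Sunday anchor (modulo 7), so MOD returns 6 and the formula gives Process_Date__c + 2, ie. Monday 8 November. Similarly, if TODAY() is Sunday 7 November 2010, MOD returns 0 and the current working date becomes TODAY() - 2, ie. Friday 5 November.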

Then my actual business days formula (a number) is the following:

Field: Days_Unallocated__c
Formula: IF (ISPICKVAL( Status__c , "Allocated"), 0 ,
IF ( Current_Working_Date__c - Process_Date_Working_Date__c <= 4,
Current_Working_Date__c - Process_Date_Working_Date__c,
(
((Current_Working_Date__c - MOD(Current_Working_Date__c - DATE(2009,1,5),7)) -
(Process_Date_Working_Date__c - (MOD(Process_Date_Working_Date__c - DATE(2009,1,5),7))) ) /7
)*5
+ MOD(Current_Working_Date__c - DATE(2009,1,5),7)
- MOD(Process_Date_Working_Date__c - DATE(2009,1,5),7)
)
)

This formula is a little more tricky; there are 3 scenarios:
  1. If the status is Allocated, it's 0
  2. If the difference between the 2 dates is up to 4 days, it means they can be calculated as a simple difference
  3. If the difference is >4, then the calculation is harder - I have to calculate the number of whole weeks between the 2 dates (* 5 for 5 working days per week), and then deal with where in each week the 2 dates fall, adding the offset from the start of the week for one date and subtracting it for the other (see the worked example below).
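
A quick sanity check of that third case: say the process date works out to Wednesday 3 November 2010 and the current working date is Tuesday 9 November 2010. The raw difference is 6 days, so the third branch applies. Rolling both dates back to their Mondays gives 8 November and 1 November - one whole week apart, so 1 * 5 = 5 working days - then adding Tuesday's offset from Monday (1) and subtracting Wednesday's offset (2) gives 5 + 1 - 2 = 4 business days, which matches counting Thu, Fri, Mon, Tue by hand.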
It's not perfect - for a start it obviously doesn't handle holidays or anomalies, but it provides a good enough indication for the user as to how many working days the transaction is unallocated for.

Let me know if it helps you, or if you have any similar formulas.

Sunday, June 6, 2010

TwitCh - my Chatter developer challenge entry - plus OAuth and JQuery

Well I've finally got my entry in for the Chatter challenge - here's hoping it stacks up against some of the others! I haven't got the official submitted entry page for voting yet, but you can check out an overview on Youtube if you're interested. There are also some pics I've put up on flickr.

And of course the 30 sec summary..
"TwitCh provides a central point to aggregate not only important information from Chatter, but also feeds from external accounts like Twitter or Facebook. Because collaboration in the enterprise isn't only limited to employees within an organisation, TwitCh enables the display of all social feed information that might be important to a business."

Pulling TwitCh together has been a great learning experience for me, and as per my previous post it's given me a chance to get up to speed on 2 areas I've been meaning to look into for a while - OAuth and JQuery.

OAuth

When I started thinking about TwitCh, the key initial idea was to pull back information from Twitter and combine it with Chatter feed info in salesforce.com. I started by looking into the Twitter dev area, and soon found that OAuth would have to form a key component of the solution. Twitter still supports xAuth (user/password), but only up to June 30th. I could have done it that way in time for the Chatter challenge, but OAuth is a much more elegant solution in so many ways, so I went for the more complex but more satisfying route.

Twitter uses OAuth 1.0A, whereas Facebook uses OAuth 2 (much easier), but the solution had to fit both providers. So I charged into building an OAuth consumer library in Apex. I looked around but didn't find anything out there for OAuth/Apex, so I built an extendable set of libraries that can be used with any service provider. I'm sure that with OAuth becoming more prevalent, Salesforce will incorporate this natively some time soon, but until then there's a library which I will put on codeshare for others to use...

I've created the following:
  • Base OAuth class which handles common properties (eg. access token, consumer key) and methods (eg. Authorise() ). This also handles common functions which are required in the OAuth process - eg. get the Epoch value for the current time, signing with HMAC-SHA1, etc
  • For each service provider, an extension to this base class should be created, where common services are implemented (eg. For Twitter, getHomeFeed(), UpdateStatus() ). This extension uses the base Authorise() and the properties, and exposes the provider-specific services
For my TwitCh app, I then encapsulated all of this into a higher level class - eg. TwitterAccount, FacebookAccount - so I could refer to the concept of an 'Account' in my app, and attach other requirements such as account-specific settings, feed colour, nickname etc
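
To make that structure concrete, here's a rough sketch of the shape of these classes. It's illustrative only - the property names, method bodies and signatures below are my own simplification rather than the actual TwitCh code (only Authorise(), getHomeFeed() and UpdateStatus() come from the description above), and in a real org the two classes would live in separate class files:

// Base class: common OAuth properties and helper functions
public abstract class OAuthClient {
    public String consumerKey { get; set; }
    public String consumerSecret { get; set; }
    public String accessToken { get; set; }
    public String accessTokenSecret { get; set; }

    // Each provider implements its own authorisation flow (request token, redirect, access token)
    public abstract PageReference authorise();

    // Seconds since epoch, used for the oauth_timestamp parameter
    protected String epoch() {
        return String.valueOf(DateTime.now().getTime() / 1000);
    }

    // HMAC-SHA1 signature over the OAuth signature base string
    protected String sign(String baseString, String signingKey) {
        Blob mac = Crypto.generateMac('hmacSHA1', Blob.valueOf(baseString), Blob.valueOf(signingKey));
        return EncodingUtil.base64Encode(mac);
    }
}

// Provider-specific extension: exposes Twitter services on top of the base class
public class TwitterClient extends OAuthClient {
    public String userId { get; set; }
    public String screenName { get; set; }

    public override PageReference authorise() {
        // call Twitter's request_token / authorize / access_token endpoints
        return null;
    }
    public String getHomeFeed() {
        // build and sign a GET request to the home timeline, return the response
        return null;
    }
    public void updateStatus(String status) {
        // build and sign a POST to the status update endpoint
    }
}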

Here's a screenshot of the layout of the classes in Eclipse, showing the Base class as well as the Twitter extension:

You can see for example that a Twitter account has parameters such as User ID and Screen Name - these are captured in the extension rather than the base class. All that really exists in the base class are the requirements for authorisation and persistence (eg. tokens).

The only requirement on the salesforce object side is somewhere to store these tokens once they are returned - both at the initial request token stage (as we are then passed over from salesforce to the service provider for further authorisation) and then again once the access token has been returned. These are stored along with an account name and a User lookup.

JQuery

After the OAuth work in Apex, jQuery was a breeze to get into my Visualforce page. TwitCh is all just one main Visualforce page, with components for settings and a separate controller. I wanted to factor in some nice usability, so used the following:
It was just a matter of including the CSS and JS from the first 2 in my page (in the case of the settings sections, I had to ensure the CSS was in the main page, not the component) and using the styleClass attribute on my Visualforce tags.
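
For what it's worth, wiring plugin assets into a Visualforce page generally looks something like the following - the static resource names, controller and merge field here are hypothetical, not the actual TwitCh markup:

<apex:page controller="TwitchController">
    <!-- pull in the plugin's CSS and JS from static resources -->
    <apex:stylesheet value="{!URLFOR($Resource.jqueryui, 'css/jquery-ui.css')}"/>
    <apex:includeScript value="{!URLFOR($Resource.jquery, 'jquery.min.js')}"/>
    <apex:includeScript value="{!URLFOR($Resource.jqueryui, 'js/jquery-ui.min.js')}"/>

    <!-- the plugin's CSS classes are then applied via styleClass -->
    <apex:outputPanel styleClass="feed-panel" layout="block">
        <apex:outputText value="{!feedContent}" escape="false"/>
    </apex:outputPanel>
</apex:page>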

I'm also grateful to tgerm.com for their work on the XML DOM that they've put on google code (https://code.google.com/p/apex-fast-xml-dom) - this saved me a LOT of XML parsing work on the responses.

Finally, if you're super-super keen then I've put some of my 'blueprints' up here on flickr also - these are some of the first notes I made when I had the idea for TwitCh and how the class structure might look. It seems like months ago, but it was only about 3 or 4 weeks ago that I seriously decided to do this, so I for one am pleased I've been able to go from concept to product in about 25 days!

I will also come clean and say that while TwitCh posts status updates to Facebook just fine, I didn't get a successful feed back that I could present in TwitCh. Facebook returns its info as JSON, and while there is a JSON parser out there (thanks Ron H - it's on codeshare), it unfortunately isn't up to parsing a 1000+ line feed from FB. Basically I hit the 200k script statement limit pretty easily ;) So my wish for coming SFDC releases is native JSON support! In the meantime I'm looking into an external JSON-to-XML parser which could potentially run on google app engine - my pet project for the coming weeks!

Thanks for reading.

Wednesday, June 2, 2010

Cloudforce Sydney and Chatter

I'm sitting in my hotel room with a great view over Sydney city and Darling Harbour (http://twitpic.com/1t84re), courtesy of a room stuff-up and subsequent upgrade (thanks Oaks Goldsbrough). I'm down here for Cloudforce down under, which is tomorrow at the Convention Centre - while most of the messages have already been conveyed in web conferences over the past few months (Chatter, Summer '10), it's always good to go, just to get a 'feel' for what's going on Salesforce-wise in Australia. By the sounds of it, and by the numbers they're expecting, quite a lot is happening - one of the partners said the last 'big' event here was almost 2 years ago with an audience of several hundred, whereas now it's 1000+. So it will be good to see who's doing what down here!

I'm also trying like crazy to finish off my entry for the Chatter Developer Challenge! It's due next Monday, but I discovered today that Summer '10 is going live this weekend, meaning my dev org will be down at a very inconvenient time! So I'm trying to get as much done before the weekend as I can over a 3G connection (note to my boss: outside of work hours, I can guarantee!). Why do hotels still persist in charging an arm and a leg for WiFi access - $20 here for 24hrs!

Anyway, I digress. My Chatter challenge app is coming along pretty well. Of course I don't want to say too much ;) but I have been working with 2 pretty cool technologies - jQuery for some front-end stuff, and OAuth for back-end communications. Both, I think, are just coming into their own (although jQuery fanatics might argue it's already well and truly there), but both are also, in my opinion, massive game-changers for web apps. Working with them is really giving me a chance to understand the ins and outs, and especially how they relate to salesforce.com, but I think pretty soon they will be as common as good ol' A HREFs were not so long ago.

Anyway back to it, but I'll write another post with details of my Chatter challenge app once it's submitted next Monday!

Thursday, May 13, 2010

The Salesforce Handbook and other 'on demand' musings

OK, so if you happen to be reading this and you saw my previous post, my good intentions have gone somewhat awry. However my excuses are many and varied (and all true) - I relocated from the UK back to Australia, got married, oh, and I've still been working remotely for my employer back in the UK. Anyway... now that the wedding is out of the way, hopefully my time will be sufficiently recovered to pick up this project again...

I digress... the main purpose of this post is to congratulate Wes on his project with Jeff Douglas - the Salesforce Handbook (official announcement here at their blog). I'm certainly looking forward to seeing how this comes along. I'm particularly lucky that Wes is part of my dev team at the Telegraph in the UK, where we manage to keep pretty busy with a lot of interesting stuff on the force.com platform. I've been involved with salesforce.com for almost 4 years and have seen the functionality of the apps and the platform grow to a pretty amazing level, and I can't quite imagine where you'd even start in encapsulating the best parts of this into a book! So good on you Wes and Jeff for tackling such a weighty task.

I'm equally impressed to see that they're going to use Lulu to self-publish the book - in both paper and digital forms, I might add. Think back even 4-5 years: did anyone talk then of using an online service to help them publish their own book? Just one copy of a book, printed on demand? I think the answer is probably no. We can now 'on-demand' for ourselves not just software and services on the internet, but hard material goods. T-shirts, books - heck, there's even a machine that can reproduce itself!

None of this may be amazing to you, but as someone who has been at the forefront of software on-demand, the notion that this can be applied to material goods still strikes me as pretty 'out there'. It just shows how our world is evolving, and that we probably can't imagine what comes next in the on-demand arena.

I went to an author talk recently by Nick Horgan, who has just released his book Three Small Suspects about growing up in NZ. The talk was great, hearing some of the excerpts from the book, which reminded me of some of the escapades I got up to growing up there also (although not nearly as bad as his!). But he went on to talk about how, even though his publishing company was a relatively localised one in Australasia, his book had magically appeared on amazon.com. He asked his brother in Canada to order one, and it duly arrived 3 days later, but neither Nick nor his publishing company had ever sent stock to amazon, nor had they even asked for actual paper copies of the book. No - when the order came through to amazon.com in the US, they printed the book and its cover, bound the copy, and shipped it, all within 3 days. Nick got hold of the book from his brother and said it was virtually identical to his own one. If we only knew how much 'on-demanding' went on in this world already, I think we'd be amazed!

Back to the original point of the post - looking forward to seeing progress on the Salesforce Handbook, guys!