Firefox: An unknown error occurred while printing 2

A while back, I had trouble printing from Firefox. I had an unreliable solution that worked for a lot of people a lot of the time. It didn't work all the time, though, and it didn't work for everyone.

The best solution is simply to uninstall Firefox, delete your Firefox profile, and reinstall Firefox. Sync your bookmarks ahead of time with Google Toolbar/Google Bookmarks or FoxMarks if you don't want to lose them. I was initially too stubborn to clean up and reinstall Firefox, but after having done it, everything works better. I am a happier man and a better citizen for having done it. Give it a try.


MobiCamp Boston 2 notes

These are my notes from two sessions I attended on Saturday at MobiCamp Boston 2.

Building Android apps

Android apps can run in background
Linux-based security
App is Java/J2ME + lots of extra classes from Google
Activity: your app's basic view, e.g. HelloStart
Android framework package names start with "android"
android.location.LocationListener: for location services
DDMS: your good debugging interface
... and I spent the rest of the hour downloading and configuring Eclipse and the Android SDK, and then writing Hello World and running it in the G1 emulator.

Selling on the app store: the first 6 months
JJ Rohrer from Elegant Technologies, LLC

He built a bunch of apps, many of which were derived from each other. He tracked the tweaks he made to each app and charted the impact of each change.
No intentional marketing--just a bunch of experiments

JJ's experiments:
  • Front page (launch day): great sales numbers
  • Most apps have asymptotic decline after launch day (Elegant Words)
  • Upgrade: sales spike, but not as high as launch day. Fresh apps sell better, and the launch-day spike exceeds the upgrade-day spike, so would it be a good idea to publish upgrades as new apps? Either way, publish upgrades frequently to get the repeated sales spikes. (The Updates list is separate from the new-releases list.)
  • Professional redesign: sales spike, but not as high as first upgrade
  • Upgrade adding skinning: sales spike, but maybe not related to skinning
  • Refer a Friend yields more purchases than in-app catalog of other apps, but hardly anyone uses refer a friend anyway (2% tap the button)
  • Web app staff pick: great usage spike
  • Category doesn't seem to make a difference
  • Free lite version of app: didn't lead to purchases of full version, probably cannibalized sales of full version
  • Android: Clone PowerNap for Android. 1000 downloads per day for free on the Android store. Attempted to sell through Handango--got only 2 sales at $1.99. Relaunched on the Android store as a pay app; can't discover it in the store, even via search.
  • Localize product description but not app (e.g., sell to Japanese consumers): no sales impact
  • Localize app (currency): no sales impact
  • Advertising
    • Facebook: cheap, no sales impact
    • LinkedIn: expensive, no sales impact
    • Google: targeted, no sales impact
  • Apple's changing guidelines and enforcement
  • Search doesn't necessarily return his apps
Category, rank, free count, paid count (sales per day)
  • education, top 1, , 100
  • lifestyle, top 50, 300, 25
  • games, top 1, , 10000

Android vs iPhone
platform, free count, paid count (sales per day)
  • iPhone, 125, 2
  • Android, 1000, 0.25
Localytics: cross platform analytics

Use Google Alerts to discover who is saying what about you.

Pirates: maybe 30% of server hits are from pirates

  • Distribution trumps all. Top 100 or featured item.
  • App is useful
  • Personalization
  • Being on app store isn't enough on its own
  • Get as many Day 1 reviews as possible
  • Launching early on a platform is more profitable


Agile for mobile

These are my notes from a presentation I gave Saturday at MobiCamp Boston 2.

The pitch
  • ... or maybe it's in the product development, too
  • Software engineering is where you spend most of your effort, time, and money when you build your mobile app
  • Scrum can help control your software engineering effort/time/expense
  • I'll share some of my team's practices
Why Agile
  • Not because
    • It's trendy
    • It's cool
    • You think it allows you to be lazy
    • You don't like process
    • (because Agile actually is a strict process, and it doesn't allow you to be ad hoc or lazy)
  • Because software engineering is the most expensive, most time consuming, least predictable part of product development
  • Need a way to mitigate the expense, control the cost, decrease risk, and get product into market faster
  • How: Eliminate Waste
    • Don't do things that don't have high business value
    • Don't make development team switch context frequently
    • Communicate
      • Frequently
      • High bandwidth
      • High signal:noise ratio
      • Know what to work on, what not to work on
Why Scrum in particular
  • Not XP
    • XP is great, but the practices don't scale well to the management level
    • XP is too micro-level, with individual practices like pair programming and unit testing
  • Scrum is a full framework for agile management and product development
  • Scrum framework is easier to communicate to management.
Scrum heritage
  • Backlog
  • Sprints
    • Goal: produce a unit of potentially releasable software
      • With no new technical debt
    • 2 weeks long
      • Long enough to get stuff done
      • Short enough so it feels urgent and so you can get frequent feedback from the customer
      • 10 work days: division by 10 is easy
    • Sprint planning meeting
      • Estimate stories
        • Estimate all unestimated stories in release backlog
        • Planning Poker
          • Avoids anchoring
          • Fun and accurate
          • Quick
      • Select, commit
        • Select highest business value items from backlog
        • Publicly, to each other
        • Based on team's known velocity
          • Review team's historic velocity to make sure commitment is reasonable.
        • Velocity per team, per day, per team member
          • Typically 4-6 hours per person per day
        • Team, not management, makes commitment
        • People tend to keep commitments that they make for themselves and that they make publicly.
      • Keep commitment stable for sprint duration
        • Don't switch context too frequently--it wastes development time (Eliminate Waste)
        • Scrum Master protects the team
    • Sprint review meeting
      • Demo
      • Retrospective
        • Team determines how to increase velocity (Eliminate Waste)
      • Gauge sprint velocity
        • Able to predict final done date after a few sprints
        • Release backlog / team velocity => # of sprints left
    • Deliver to customer on last day of each sprint
  • Burndown chart
    • Sprint burndown
    • Project burndown
    • How many hours did you commit to getting done? How many hours are left?
      • Offers clues to whether your sprint is on track.
      • Slope of graph compared to ideal linear slope
      • How close to 0 work left?
      • What sprint day is it?
      • Are you on track?
    • Able to make adjustments based on actual progress data
      • Add people
      • Remove committed items
      • Warn product owner, customer
    • Burn down hours/story points when a story is Done
    • What is Done? What does Done mean?
    • Publicly track actual progress against commitments
  • Daily scrum
    • What did you get done yesterday? What do you plan to get done today? What are the impediments to getting things done?
    • Publicly track actual progress against commitments
    • Look for impediments that prevent the team from meeting commitments
    • Scrum Master removes impediments
    • Communicate
  • Weekly scrum-of-scrums
    • With each customer
    • With senior management
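
The velocity and burndown arithmetic in the notes above can be sketched in a few lines of Python. The backlog, velocity, and sprint numbers here are made up for illustration, not from any real team:

```python
import math

# Hypothetical numbers -- not from a real project.
release_backlog_hours = 400      # estimated work remaining in the release backlog
team_velocity_per_sprint = 120   # hours of work the team completes per sprint

# Release backlog / team velocity => number of sprints left
sprints_left = math.ceil(release_backlog_hours / team_velocity_per_sprint)
print(sprints_left)  # 4

# Sprint burndown: compare actual hours remaining to the ideal linear slope.
sprint_length_days = 10   # 10 work days: division by 10 is easy
committed_hours = 120     # hours the team committed to getting done
day = 6                   # what sprint day is it?
hours_remaining = 55      # how many hours are left?

ideal_remaining = committed_hours * (1 - day / sprint_length_days)
on_track = hours_remaining <= ideal_remaining
print(on_track)  # False: 55 hours left vs. an ideal of 48 -- time to adjust
```

When `on_track` goes false, that is the cue for the adjustments listed above: add people, remove committed items, or warn the product owner.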
Agile != lazy or ad hoc
  • Deliberate
  • Surprisingly detail oriented
  • Lots of communication through the Scrum rituals (burndown chart, daily scrum, sprint planning and review meetings)
  • You are not excused from excellent up front project planning
  • You are excused from anything that doesn't help the team get closer to being Done (Eliminate Waste!)
  • Fixed fee project difficult to reconcile with Agile.
    • When will it be done?
    • How much will it cost?
  • Estimation
    • Can't estimate stories/requirements you don't know about
    • Can't properly estimate stories that change
  • Difficult to get people to prioritize backlog
  • Eliminate waste
    • Don't work on stuff that has low biz value
    • Save money
    • Increase velocity
      • Velocity is a vector. The direction is always "toward the most business value."
  • Backlog grooming forces people to state and agree on priorities
    • Painful but must be done
  • Able to accurately predict done date
    • After a few sprints
    • If you know team's velocity and size of backlog, you can predict done date
  • Able to tell customer not to add or change requirements
    • Easy to show customer how adding to backlog delays done date
  • Happier dev team, customer, management


Mobile success factors: how to succeed, how to fail

These are my notes from a presentation I gave Saturday at MobiCamp Boston 2.

The pitch

  • You are building a mobile app
  • You want it to be successful
  • How do you do that?
  • Is there something in product development that you can control? Of course, but we all know how to build a great app, so that doesn't distinguish you.
  • So it's all about the marketing, right?
  • I surveyed six mobile apps to find out why some of them are successes and others are not.
  • What are the key mobile success factors that will help you win?
  • Sit in on my talk to hear about the survey and find out why some apps succeed and some don't.
  • We can all build excellent mobile apps
  • What is it that makes a mobile app successful?
  • If you were to build a new mobile app, what kind of app would be your best bet? What kind of marketing would be your best bet?
  • Be patient--the answers come later in the talk.
  • Disclaimer
    • This is early research.
    • Think of it as the seed of some useful information.
    • Needs to be tuned and improved
  • Given two apps that are almost exactly the same, why is one successful but the other is not?
  • How can all the apps we build be successful?
  • Background reading
    • Theory of Constraints
      • A friend gave me the book The Goal: A Process of Ongoing Improvement by Eli Goldratt
      • A business novel. Instead of a dry how-to guide, it is a novel telling the story of a business team running a division of a failing company and how they fix it and make it succeed.
      • Got excited about it, read the two follow up books
    • Scrum
      • ToC, Toyota production system tuned for software engineering (really for any business)
    • Toyota production system
      • Eliminate waste
      • For software, see Scrum: don't do things that have low business value
  • ToC process
    • Identify your goal
    • Identify the biggest constraint on achieving your goal
      • Globally, through the whole system, not just in your area of responsibility (for me, this means not just in software engineering)
    • Fix that constraint
    • Repeat
  • ToC in practice, for mobile apps
    • Identify your goal
      • Make money by building successful mobile apps
    • Identify the biggest constraints on achieving your goal
      • Product development
        • How can I build more apps faster?
      • Marketing
        • How can I sell more apps after they are built?
    • Prioritize
    • Fix that constraint
    • Repeat
Product development
  • Product life span
    • Product development is most expensive, most time consuming, riskiest part
    • Large revenue opportunity at end of product development
  • Definition: everything that goes into building an app suitable for sale to consumers
  • UX design
  • Software engineering
    • Most expensive, most time consuming, riskiest part
    • Address via Scrum
      • Scrum is like Theory of Constraints tuned for Software Engineering. When applied well here, Scrum mitigates this constraint, but doesn't eliminate it.
Marketing
  • Product life span
    • Many smaller revenue opportunities beginning at product launch
  • Definition: everything that goes into selling the app to consumers and making money on it
  • Given two apps that are almost exactly the same (Music1 and Music2), why is one successful but the other is not?
    • Same product development work, so eliminate product development as biggest constraint
    • It must be the marketing--our biggest constraint on success

  • Identify mobile apps to evaluate
    • Identified six mobile apps
    • Due to nondisclosure agreements, cannot name apps, who built them, or for whom they were built
    • Things these apps have in common
      • All apps are currently available to consumers
      • All apps have a monthly subscription revenue model, not one time purchase
      • All are data connected. Getting and storing data OTA on a remote server
    • None are iPhone App Store apps
    • Some are downloadable apps, some are preloaded on phones, some are web apps
  • Gauge each app's perceived level of success
    • Ask producer, sponsor: is this app successful?
      • Keep it simple: binary answer, Yes or No
    • Qualitative, not numeric. "I know it when I see it."
    • Adequate for this study; people have internalized goals for the apps and have no hesitation communicating whether an app is successful or not
    • 3 successful, 2 not successful, 1 not sure
  • Enumerate obvious characteristics/dimensions
    • 22 characteristics/dimensions to help understand why some apps succeed and others don't
    • Mostly marketing oriented, but some product development oriented
    • State each characteristic as a predicate, e.g. Is it a web app? Is it inexpensive? Does it include streaming video?
      • Keep it simple: binary answer, Yes or No
    • Try to construct the predicate such that the answer should be positive for a successful app
  • For each characteristic/predicate/dimension
    • Answer the predicate for each app: Yes or No.
      • Predicate nature keeps it simple.
    • Derive a success correlation score
      • The probability that a successful app has this characteristic
      • If you have this characteristic, are you likely to succeed?
      • The number of Yes answers divided by the number of successful apps
    • Derive a failure correlation score
      • The probability that an unsuccessful app lacks this characteristic
      • If you lack this characteristic, are you likely to fail?
      • The number of No answers divided by the number of unsuccessful apps
    • Combine success score and failure score into a weighted score
      • weighted-success-correlation = success-correlation * num-successful-apps
      • weighted-failure-correlation = failure-correlation * num-unsuccessful-apps
      • weighted-score = (weighted-success-correlation + weighted-failure-correlation) / num-apps
  • Sort dimensions by combined weighted success score
    • High score => correlates to success
    • Low score => does not correlate to success
  • If combined weighted success score is >0.50, then the dimension is a success factor
  • In general, ignore the "not sure" app. It is not obviously either a success or a failure, so its results don't help.
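
The scoring steps above can be sketched in Python. The predicate answers below are hypothetical, chosen to show a perfect score for one characteristic; the survey's real answers are not reproduced here:

```python
# Hypothetical answers for one predicate (say, "Is it inexpensive?") across
# the five apps with a clear verdict -- the "not sure" app is ignored.
successful = [True, True, True]    # answers from the 3 successful apps
unsuccessful = [False, False]      # answers from the 2 unsuccessful apps

num_successful = len(successful)
num_unsuccessful = len(unsuccessful)
num_apps = num_successful + num_unsuccessful

# Probability that a successful app has this characteristic
success_correlation = sum(successful) / num_successful              # 1.0
# Probability that an unsuccessful app lacks this characteristic
failure_correlation = unsuccessful.count(False) / num_unsuccessful  # 1.0

weighted_success = success_correlation * num_successful
weighted_failure = failure_correlation * num_unsuccessful
weighted_score = (weighted_success + weighted_failure) / num_apps
print(weighted_score)  # 1.0 -- above 0.50, so this dimension is a success factor
```

The weighting simply means each app counts once, whether it lands in the success half or the failure half of the score.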
Results: top success factors
  • High: combined weighted correlation score == 1.00
    • Acceptable number of subscribers
      • A proxy for "acceptable monthly revenue"
      • "Acceptable" different from "high". Depends on app's target number of subscribers or target revenue.
      • A new app with 1,000 subscription events per month might be acceptable. An app that had high product development cost and has 100,000 subscription events per month might be unacceptable.
      • Based on your criteria, if the app's number of subscribers meets its target, whatever that target is, then the app is successful.
    • Inexpensive (Retail price less than $2.75 per month)
      • Consumers will buy it if it's inexpensive.
      • $2.50 per month is an attractive price point.
      • Two of the successful apps are quasi-free: included for free
        • In monthly membership of a related service, or
        • In the price of the carrier's data plan.
      • Unsuccessful apps are as high as $5 per month
    • Web app
      • Two of the successful apps are web only
      • One of the successful apps is both web and download/preload
      • Web apps have a low barrier to use compared to download apps
        • There is nothing to download--just launch it in your mobile web browser
      • Successful apps are either
        • Easy to find on carrier deck, or
        • Have easy way to send URL to your phone via SMS
      • Easy to adapt web app to large number of devices
  • Next best success factors: combined weighted correlation score == 0.80
    • Small carrier size
      • Small carrier == has less than 50 million subscribers
      • Two of the successful apps are on a small carrier deck
      • Both of the unsuccessful apps are on a large carrier deck
    • Few competing apps on deck / in app store
      • Two of the successful apps are one-of-a-kind on the carrier deck
      • The two unsuccessful apps have many competitors
      • May be a corollary to "Small carrier size"
      • Do app vendors perceive "large carrier" as "large potential market of consumers"? Are too many vendors attracted to large carriers, leading to too much app competition?
    • Long term deck / app store promotion
      • Two of the successful apps have been long term featured items on carrier's deck
      • The two unsuccessful apps have not been featured items for more than one or two weeks at a time.
    • Carrier branding
      • Two of the successful apps are the "CarrierX Special Cool App", extending the carrier's brand name to the product category
      • Neither unsuccessful app is carrier branded.
      • Another corollary to "small carrier size" and "long term deck / app store promotion"? Does carrier prefer to promote it because of the branding?
Results: bottom success factors (non-success factors)
  • Product category
    • Not a numeric success score
    • The factor that got me started on this: if two apps are virtually identical, but one is successful and the other isn't, why?
    • Survey uses category names from iPhone App Store. 2 Music apps, 2 Entertainment apps, 1 Lifestyle, 1 Sports
    • One Music app is successful, the other isn't
    • One Entertainment app is successful, the other isn't
    • Product category not a predictor of success.
  • Low: combined weighted correlation score == 0.20
    • Product >12 months on market
      • Product age does not correlate to success
      • One successful app is one month old; another is 24 months old
    • Short term deck / app store promotions
      • Similar to, but not the exact complement of "long term deck / app store promotion"
      • Occasional short term promotions do not yield success
    • Short term mobile web advertising
      • Targeted to supported devices
      • Occasional short term advertising does not yield success
      • No apps had long term mobile web advertising, so don't know whether that would be a success factor
    • Famous brand
      • Only one of the successful apps has a famous brand name
      • Both unsuccessful apps have a famous brand name
    • Downloadable app
      • Similar to "web app"
      • Downloading is a barrier to installation?
      • More difficult to adapt app to a large number of devices
        • Device fragmentation: Java, BREW, iPhone, Android, screen sizes, etc.
    • Streaming video
      • Only one successful app is video oriented
      • Both unsuccessful apps contain streaming video
  • If you were to bet on the success of a new mobile app, what should you look for?
  • Bet on
    • Inexpensive (Retail price less than $2.75 per month)
    • Web app
      • Or at least a web alternative to a download app
      • E.g. Remember the Milk
    • Few competing apps on deck / in app store
    • Long term deck / app store promotion
  • Don't bet on
    • Downloadable app
    • Streaming video
    • Short term promotions or advertising
    • Famous brand
  • It's only six apps with a specific revenue model (monthly subscription)--may not apply to you
Next steps
  • Identify additional apps to evaluate
    • Including single-purchase apps
    • Global survey?
  • Devise a quantitative measure of app success
    • Better than "I know it when I see it."
  • Identify additional dimensions (including more non-marketing factors), evaluate apps in those dimensions
    • Free trial
  • Strengthen statistical model
    • Although these initial results pass the stink test
      • Stink Test: If it smells rotten, it probably is.
      • These results don't smell rotten.
  • Grow this seed into a useful tree.


Done with Jira

Great--now the team has a rough agreement on what Done means.

Not great--Jira disagrees with us.

Here's the problem. Semantically, we divide our work tasks into two categories: Not Done and Done. Jira semantically divides issues into statuses "open" and "resolved"; syntactically, Jira groups issues with statuses {Open, In Progress, Reopened} and {Resolved, Closed}. In Jira, an issue is "open" if its status is in the set
{Open, In Progress, Reopened}
and an issue is "resolved" if its status is in the set
{Resolved, Closed}
Work tasks that are Not Done are the tasks that are in our sprint backlog and that appear as unspent hours on our sprint burndown chart. We want Jira's default filters and reports to tell us about these issues--the ones that are Not Done (not Resolved or Closed)--but the way developers use Jira, Resolved doesn't mean the same thing as Done.

In its default configuration, Jira is a code writer's tool. Its default filters and reports reflect the code writer's perspective of task completion: if the developer wrote the code, his unit tests passed, and he committed the code, he marks the issue Resolved/Fixed. When he marks an issue Resolved, it is no longer "open" and it disappears from the default filters and reports, such as the issue counters on the Browse Project page, the Components and Versions filters, and the Version Workload Report. Because Jira's Resolved is not the same as the team's definition of Done, it is difficult to use Jira to track which issues are Done and which are Not Done--that is, to track sprint backlog and burndown.

Yet Jira is an excellent tool for tracking issues and organizing work tasks. We use Jira to organize our product backlog, release backlog, and sprint backlog. Because we rely on Jira, we need the default Versions filters to show us which issues are Not Done yet in each backlog. We need the default Version Workload Report to show us how much effort estimate remains in our sprint backlog, so we can keep our sprint burndown chart up to date. How can we reconcile Jira's focus on the individual developer with our need for a team view of Done?

We solve the problem by creating a custom Scrum workflow. Our Scrum workflow has a new state, In Verification & Validation. Semantically, this state is a different kind of "open" or Not Done. Now, an issue is "open" if its status is in the set
{Open, In Progress, In Verification & Validation, Reopened}
We add transitions to and from In Verification & Validation to support the new workflow. When a developer thinks he completed his part of an issue, he clicks Ready for Verification to mark it In Verification & Validation, and the rest of the team knows it is fair game for testing. We mark an issue Resolved/Fixed only when the team as a whole thinks the issue is Done.

Team members can now communicate different kinds of Not Done to each other through Jira, and Jira's default filters and reports still work. The Components and Versions filters correctly show the issues that are Not Done, and the Version Workload Report accurately shows remaining burndown. Problem solved, and Jira is our friend again.
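
The status sets behind this workflow can be sketched in Python, showing how a Not Done filter and the remaining burndown fall out of them. The issue keys and estimates are made up for illustration:

```python
# Status sets for the custom Scrum workflow; names match the post.
NOT_DONE = {"Open", "In Progress", "In Verification & Validation", "Reopened"}
DONE = {"Resolved", "Closed"}

# Hypothetical issues, as (key, status, estimated hours remaining).
issues = [
    ("APP-1", "In Progress", 8),
    ("APP-2", "In Verification & Validation", 3),
    ("APP-3", "Resolved", 5),
]

# The team's "Not Done" filter: everything still open in the wider sense.
not_done = [(key, hours) for key, status, hours in issues if status in NOT_DONE]
print([key for key, hours in not_done])        # ['APP-1', 'APP-2']

# Remaining burndown: sum the estimates of the Not Done issues only.
print(sum(hours for key, hours in not_done))   # 11
```

The point of the custom In Verification & Validation status is exactly this: it keeps an issue inside the Not Done set, so its hours still count toward burndown until the whole team agrees it is Done.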


Manage by commitment + manage by process == manage by Scrum

Donald Sull recently discussed three styles of management, with management by commitments as the winner and management by process the runner-up. I agree: these are key aspects of Scrum, and two of the reasons Scrum works.

Sull's first style of management is managing by power hierarchy. This is the old command-and-control style of management. This style just doesn't work in innovation industries. When you are building something new and complex, you don't know exactly what you are building early enough to know what to command and how to control it. You can't know everything to do ahead of time, and you can't control every behavior and task outcome. You delude yourself by inventing fictional project plans, full of deceptively credible looking work breakdowns, effort estimates, and Gantt charts. You get upset when your team veers from your prescribed work sequence, and your team is frustrated that it can't do things "the right way." You all lie to each other and your upper management team that the project is making adequate progress, and then it all hits the fan near your delivery deadline.

The second management style Sull discusses is management by process. Sull criticizes Six Sigma, TQM, and other overly rigid processes as innovation blockers. Scrum responds to this by being lightweight. If the Scrum team identifies a practice that impedes its progress and acceleration, then the team addresses it at the daily scrum, at the sprint retrospective, or just by working closely together and communicating frequently. Scrum's lightweight framework stays out of the way at the right times and encourages practical improvements as needed.

Sull's third style is management by commitment. He identifies five characteristics of good commitments: public, active, voluntary, explicit, and motivating. Good commitments are made and tracked publicly. The person making the commitment takes an active role in making sure he understands what to do. People make commitments voluntarily, with the ability to say no. The commitment has explicit actors--a person fulfilling the commitment and a person benefiting from it. Finally, every commitment has a motivating rationale.

Sull doesn't say it, but the combination of managing by process and by commitment is Scrum. Scrum is a lightweight process framework, providing enough process to be effective without being stifling. A Scrum team makes its commitments publicly and publishes them on the project task board (see http://www.mountaingoatsoftware.com/task-boards and http://www.scrumalliance.org/articles/39-glossary-of-scrum-terms). Every task that is committed trumps any task that the team hasn't committed to. Team members face each other at least once a day at the daily scrum, where they report their progress against their commitments. The team as a whole tracks its progress against its commitments daily and at sprint boundaries. The team voluntarily makes its sprint commitments as it selects items from backlog for the current sprint. The actors are explicitly identified: members of the team do the work for the product owner. The rationale for doing each task is clear, expressed by the user stories and by the business value and prioritization associated with each story, and further explained and amplified by the product owner as needed.

Scrum teams are successful for the same reasons teams that use process and commitment are successful. Scrum succeeds by trusting people to make and meet commitments in a can-do atmosphere.


Is it done?

The question is a cliche among agile teams: what does Done mean? I recently heard a number of responses from two teams:
  • It works so well we are willing to give it to the customer. I like this definition. These team members are proud of their work and are focused on pleasing the customer.
  • The pilot has landed the airplane safely. The metaphor is that we are delivering a service, a trip on an airplane. It is an attractive metaphor because of the extremeness of the Not Done case: the plane crashes and people die. The metaphor becomes squirrelly, though. Is it Done if the entertainment video screens were broken? Is it Done if there were no snacks and drinks on the flight? The team agrees that this metaphor doesn't work.
  • The code compiles, you can install the app, you can launch the app, and you can exit the app, but there are many bugs left to fix. This team member decomposes the realization of a story into Code, Test, and Debug; he is thinking Waterfall rather than Agile. He focuses on the Code segment, and thinks he is Done when he is done with the initial coding. He is working in isolation, not well integrated with the team. He misses the point that the story is not Done until the team decides it is Done.
  • The app is ready for production deployment. This is another good definition, similar to the first one. This team member is focused on the customer and end users. The team is not Done until it can publish the app on production servers and share it with the world.
Done is an important concept. When we plan sprints and releases, we estimate stories and tasks in terms of how long it will take for the team to be Done. Discussing and defining Done is a valuable exercise and conversation.


iPhone Dvorak for real

Update: See Read More on Dvorak for iPhone for more!

It's here, and it's the real deal: a Dvorak keyboard layout for your iPhone. Unlock your iPhone, run Cydia, and install the Dvorak Keyboard addon for iKeyEx. To activate the Dvorak layout, tap the globe key until you see the familiar AOEUI key sequence. As a bonus, you get punctuation keys like ',.?- on the primary keyboard; you don't have to tap the 123 key for punctuation, as you do on the iPhone's Qwerty keyboard.


