Friday, April 24, 2009

Branching off renamed trunk

Recently I was asked a small but non-obvious branching question. Suppose you have a folder named FolderName, and for some reason you have renamed it to NewFolderName. All is well, but now you decide you want to create a branch from that folder, from a version prior to the rename.

Due to the reasons detailed in my older post, you will not be able to use the branching UI for this operation. The only way to achieve it is to use the tf command-line client's branch command, where you explicitly specify the version you branch from and the folder name as it existed at that revision:

tf branch $/Project/FolderName $/Project/Branch /version:C123

A typical mistake people make is to use the current item name, NewFolderName, instead of the name that existed in the past (i.e. FolderName at the time of changeset 123).

Mirrored from MSDN blog

Saturday, April 11, 2009

Work Item customization tidbits: estimating the effort (part 12 of X)

My apologies for the long silence on the subject of this series (due to several recent events), but hopefully I am now back on track, and I have a long backlog :)

In my previous posts I have discussed various bits that are important to know before taking on Work Item type customization. Today I'd like to talk more about approaching the whole process.

I would like to advocate a conservative approach, since in most organizations (at least in my experience) there are limited resources dedicated to customization, user support and maintenance of Work Item types.

The easiest way to jump-start the customization process is to use one of the existing templates; I'd recommend the stock templates that come with TFS (MSF Agile or CMMI); however, nowadays there are other decent templates available (for example, Conchango Scrum is well known and widely used). At the very least, that provides you with the minimum work item logic implemented in a professional manner.

To understand the customization effort required, it is helpful to review the following:

1. Detail the new data fields to be added to those existing in the Work Item type; note whether the rules of existing fields need to be modified

2. Identify the desired work item state lifecycle and how it compares with the existing one for the Work Item type (paying attention mostly to the flow rather than to the state names).

3. For your new custom fields, see whether there is any special logic to be implemented, viz.

  • Whether field rules are to be scoped by user/group
  • Whether field rules are to be scoped for different states
  • Whether the field needs to be associated with a static/dynamic list of values
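To make these options concrete, here is a hypothetical fragment of a Work Item Type definition (the field name, refname and group name are invented for illustration; the rule elements themselves, ALLOWEDVALUES, LISTITEM and the group-scoped REQUIRED, are standard WIT XML):

```xml
<!-- Hypothetical custom field; names are illustrative only -->
<FIELD name="Estimated Effort" refname="MyCorp.EstimatedEffort" type="Integer">
  <!-- static list of allowed values -->
  <ALLOWEDVALUES>
    <LISTITEM value="1" />
    <LISTITEM value="2" />
    <LISTITEM value="3" />
  </ALLOWEDVALUES>
  <!-- rule scoped by group: required for everyone except team leads -->
  <REQUIRED not="[Project]\Team Leads" />
</FIELD>
```

Rules scoped to specific states live in the STATE and TRANSITION sections of the type definition rather than on the field itself.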

Once you create a mapping table of the desired vs. existing fields, these data may be used to estimate the complexity of development & maintenance. I have tried to compile a (somewhat biased) complexity list of elementary field customization tasks (ordered from the simplest to the most demanding):

i. New data field. The simplest customization possible, both from the point of implementation and of subsequent maintenance. May require additional effort if the field is to be reported on (since integration into reports will be required).

ii. Data field with lists of values (local or global lists). For static (i.e. rarely updated) lists of values (such as priorities), both the implementation and the maintenance are fairly simple. However, if the list content is dynamic (such as customers), make sure you plan for maintenance and, more importantly, do not make any assumptions about the list content in the fields' rules.
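A dynamic list is usually attached through a global list; a minimal sketch (the field and list names are assumed for illustration):

```xml
<!-- Hypothetical field backed by a global list; names are illustrative -->
<FIELD name="Customer" refname="MyCorp.Customer" type="String">
  <ALLOWEDVALUES>
    <!-- list content is maintained outside the WI Type definition -->
    <GLOBALLIST name="Customers" />
  </ALLOWEDVALUES>
</FIELD>
```

Since the list content changes independently of the type definition, no other rule in the type should hard-code specific values from it.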

iii. Data field with static rules logic (no dependency on state/user). Since the rules implemented may be pretty complex, the scenario is as complex as you make it from the point of implementation. And depending on how well you test the implementation, maintenance may range from none to nightmare.

iv. Data field with rules logic dependent on state transitions. When the rules defined include a dependency on the state lifecycle, that generally means you need to put extra effort into testing (for a large state chart the effort may be very significant) and plan for regression testing whenever the state lifecycle is modified.
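Such state-dependent rules are attached to transitions in the WORKFLOW section of the type definition; a hypothetical sketch (the state names follow the stock templates, the field refname is invented):

```xml
<TRANSITION from="Active" to="Resolved">
  <REASONS>
    <DEFAULTREASON value="Fixed" />
  </REASONS>
  <FIELDS>
    <!-- the field becomes mandatory only when this transition is taken -->
    <FIELD refname="MyCorp.RootCause">
      <REQUIRED />
    </FIELD>
  </FIELDS>
</TRANSITION>
```

Every such rule multiplies the test matrix: each transition that touches the field has to be re-verified whenever the lifecycle changes.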

v. Data field with logic dependent on user/group. When rules are scoped to specific groups (rarely to individual users in a corporate environment), the complexity of the environment may have a bearing on the WIT behavior. Namely, in an Active Directory environment with multiple levels of group nesting, it might not be easy to diagnose why your rules function incorrectly (whether too loose or too restrictive). Extra maintenance may well be expected.

vi. Data field with custom controls. When your data field, in addition to rules expressed in the WI Type definition, has logic defined in a custom control assembly, you have just added an extra dimension to implementation, testing and maintenance. The task becomes even more complex if the custom control should also work in the Web interface.

Once you have identified the work to be executed, you will be able to plan the effort required for implementation, testing, deployment and maintenance.

In conclusion, I'd like to highlight two very important principles which, when followed, will prevent a plethora of issues: a) never deploy to production before deploying to a test environment, and b) plan and execute the whole WI Type customization process as if it were an ordinary software development effort.

Related posts:
- Work Item Customization: customization and global lists (part 11)
- Work Item Customization: customization process (part 10)
- Work Item Customization: customization tools (part 9)
- Work Item Customization: special fields (part 8)
- Work Item Customization: fields maintenance (part 7)
- Work Item Customization: global lists (part 6)
- Work Item Customization: system fields (part 5)
- Work Item Customization: user interface (part 4)
- Work Item Customization: state transitions (part 3)
- Work Item Customization: conditional field behavior (part 2)
- Work Item Customization: fields definition (part 1)

Mirrored from MSDN blog

Wednesday, April 01, 2009

TFS Administrator chores – space offender strikes again!

In my previous post I talked about management of large files in TFS version control database. Today I’d like to talk about what you can do to optimize space management in work item tracking database.

As you know, it is possible to add file attachments to a Work Item, with a maximum attachment size of 2 MB (by default); but most people who use attachments with WIs change that limit to something larger (this MSDN article details how to change the maximum attachment size), since the default frequently does not suffice for video captures and such.

Which naturally brings us to the question: if the maximum size is set to, say, 32 MB, how could one prevent misuse of the attachment feature?

There is nothing in the Team Explorer UI to help you figure out the size of an added attachment, and nothing to prevent a user from adding any number of large attachments (as long as each is no greater than the maximum size). That leaves you with user education as a form of prevention; and to report on the actual usage, it is possible to run raw SQL against the relational database (all of the queries below are provided strictly AS IS, etc.):

-- Query WIT database for large attachments
USE TfsWorkItemTracking;

-- size threshold in bytes (example value: 1 MB)
DECLARE @LargeFile int;
SET @LargeFile = 1048576;

SELECT
    -- parent work item
    ID AS WorkItemID,
    -- name of the attachment file
    OriginalName AS AttachmentName,
    -- attachment comment
    Comment AS AttachmentComment,
    -- file size
    [Length] AS [Size],
    -- whether attachment was deleted
    CASE WHEN RemovedDate = '01/01/9999' THEN 0
         ELSE 1 END AS Deleted
FROM WorkItemFiles
    -- File attachments only
WHERE FldID = 50
    -- return only large files
    AND [Length] > @LargeFile

The query will give you the list of WIs with large attachments, so you can figure out whether the feature is used in a sensible way.

If you look at the query closely, you'll notice that an attachment can be removed from a WI and still exist in the database. What does that mean, you say? Whereas in version control one can delete an item (the item still remains in the DB) and then destroy it (the item is purged from the DB), there is no such two-step feature for Work Item attachments.

It turns out that when you delete an attachment from a Work Item, the actual content is never deleted from the database unless you do it manually. There is even a helpful but incredibly well-hidden and vague article on MSDN on the subject, titled “How to: Delete Orphaned Files Permanently”.

That means that even if you have managed to delete the large attachments from the WIs, your job of recovering the space is only half-done; you still need to delete the attachment content from the database.
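Before purging, it may be worth sizing the problem first; a hypothetical helper query (built from the same WorkItemFiles columns used above) that totals the orphaned content:

```sql
-- How much space do the orphaned attachments hold?
SELECT COUNT(*) AS OrphanCount,
       SUM([Length]) AS TotalBytes
FROM TfsWorkItemTracking.dbo.WorkItemFiles
WHERE RemovedDate <> '01/01/9999'  -- removed from the work item
    AND FldID = 50                 -- file attachments only
```

If TotalBytes is negligible, a manual purge may not be worth the risk of running raw DELETE statements against the production database.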

The first query below enumerates all orphaned (deleted from Work Items, but still in the DB) attachments, while the second one can be used to actually purge the deleted content from the database.

-- Query for all orphaned attachments
SELECT WorkItems.ID AS WorkItemID,
       WorkItems.OriginalName AS AttachmentName
FROM TfsWorkItemTrackingAttachments.dbo.Attachments Attachments,
     TfsWorkItemTracking.dbo.WorkItemFiles WorkItems
WHERE Attachments.FileGuid = WorkItems.FilePath
    AND WorkItems.RemovedDate <> '01/01/9999'
    AND WorkItems.FldID = 50

-- When absolutely sure - delete the orphans
DELETE
FROM TfsWorkItemTrackingAttachments.dbo.Attachments
-- correlate with WIT tables to identify orphans
WHERE FileGuid IN (SELECT FilePath
        FROM TfsWorkItemTracking.dbo.WorkItemFiles
        WHERE RemovedDate <> '01/01/9999'
        AND FldID = 50)

Purging orphans seems to me a good candidate for a recurring job (not sure why it is not part of the core TFS setup).

Mirrored from MSDN blog