Tuesday, November 28, 2006

Merging and resolving conflicts using API

Recently, there were several questions on the MSDN forums about how to perform a merge and resolve conflicts using the Version Control object model API. Since the online help for these APIs is rather sparse, I thought I'd share the merge-related gotchas I have accumulated over time.
I will have a look at a fairly typical merge automation scenario (where all changes are merged with a uniform conflict resolution algorithm).

The first thing to do is to call the Merge method of the Workspace class:

    // workspace name, owner and paths below are examples
    Workspace workspace = _serverVc.GetWorkspace("WORKSTATION", "owner");
    GetStatus status = workspace.Merge("$/Project/Ongoing/Solution",
                                       "$/Project/Branches/Solution",
                                       null,
                                       null,
                                       LockLevel.None,
                                       RecursionType.Full,
                                       MergeOptions.None);

Let’s have a look at Merge and its parameters (I will talk about the more complex overload; the other version just takes default values for the unexposed parameters):

    public GetStatus Merge (string sourcePath,
                        string targetPath,
                        VersionSpec versionFrom,
                        VersionSpec versionTo,
                        LockLevel lockLevel,
                        RecursionType recursion,
                        MergeOptions mergeOptions)

The first two parameters specify the source and target paths for the merge. These may be either server or local paths (the target path must be mapped in the workspace prior to the merge).
The “versionFrom” and “versionTo” parameters may be used to specify the range of versions to merge. One may use them in several different ways. By default (if null is specified for the first parameter, and null or the latest version for the second), all unmerged changes in the source will be merged. If the same changeset is specified in both parameters, only the changes in that changeset will be merged (a selective merge). To merge changes up to a specific changeset, the first parameter may be null and the second the version up to which the merge is required.
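A selective merge of a single changeset might then look like the sketch below (the changeset number and paths are made-up examples):

    // merge only the changes committed in changeset 42
    ChangesetVersionSpec spec = new ChangesetVersionSpec(42);
    GetStatus status = workspace.Merge("$/Project/Ongoing/Solution",
                                       "$/Project/Branches/Solution",
                                       spec,
                                       spec,
                                       LockLevel.None,
                                       RecursionType.Full,
                                       MergeOptions.None);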
The “lockLevel” parameter specifies what lock will be set on the pended changes (LockLevel.None will usually do – I cannot at the moment think of a scenario where you would want another lock).
The “recursion” parameter specifies the level of recursion to use on the source and target paths (with folders one will usually use RecursionType.Full).
The “mergeOptions” parameter is the one that defines the merge's initial behavior. It may have the following values (see the MergeOptions enum):

  • None – no special options (same as using Merge wizard in VS UI)

  • AlwaysAcceptMine – discard any changes in the source, and just update the merge history (same as the discard option in the tf.exe merge command, not available in the VS UI). Essentially, the option says to resolve all conflicts using Resolution.AcceptYours (more on that below)

  • ForceMerge – do not look at merge history, and perform the merge for the specified range of versions from the source as if no merges were performed (same as the force option in the tf.exe merge command, not available in the VS UI). When this option is specified, the “versionFrom” and “versionTo” parameters must be set in the call to Merge

  • Baseless – perform baseless merge (when source and target items have no branch relationship between them)

  • NoMerge – do not perform actual merge (same as preview option in tf.exe merge command, not available in VS UI)

All options except NoMerge are mutually exclusive.

After all parameter values are specified and Merge is invoked, the next thing to do is to look at the returned value (of type GetStatus). Personally, I dislike it very much, as it provides information in a rather incomprehensible way - each field in the return value, as well as their combinations, tells you what happened in the merge.

The possibilities are (I know those by trial and error, so there is probably a wealth of other interesting combinations I have missed):

  • NoActionNeeded == true && NumOperations == 0 – means that no changes in the source needed to be merged, so no actual changes were pended

  • NoActionNeeded == false && NumOperations > 0 && HaveResolvableWarnings == false – means that merges were performed, and all conflicts were resolved automatically. Just check in the pended merge changes and that’s about it

  • NoActionNeeded == false && NumConflicts > 0 – a merge was performed and there are conflicts to resolve

The first two cases are obvious. In the last case there are conflicts and resolution is required. I will talk only about simple conflicts (content changes) and not rename/delete changes (those are kinda complicated, and I will leave them to the MS guys with access to the source code; besides, I doubt there is much merit in automatic merge or conflict resolution for delete/rename changes).
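The combinations above can be sketched roughly as follows (a simplification for illustration, not an exhaustive decision tree):

    if (status.NoActionNeeded && status.NumOperations == 0)
    {
        // nothing to merge - no changes were pended
    }
    else if (status.NumConflicts > 0)
    {
        // merge performed; conflicts must be resolved before check in
    }
    else
    {
        // merges pended, conflicts (if any) resolved automatically - just check in
    }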

Let’s try to implement conflict resolution algorithm similar to manual merge in Visual Studio.

First, one needs to retrieve list of conflicts:

Conflict[] conflicts = workspace.QueryConflicts(new string[] { "$/Project/Branches/Solution" }, true);

The QueryConflicts method is pretty obvious – it returns all conflicts on the specified paths in the workspace, with the last parameter specifying whether the query should be recursive.
Now it is possible to iterate over the conflicts and resolve them one by one:

    foreach (Conflict conflict in conflicts)
    {
        if (workspace.MergeContent(conflict, true))
        {
            conflict.Resolution = Resolution.AcceptMerge;
            workspace.ResolveConflict(conflict);
        }
    }

The code above calls the MergeContent method for each conflict, which invokes the visual merge tool. After the user has performed the visual merge (indicated by MergeContent returning true), the conflict is ready for resolution.
To resolve a conflict, the Resolution property of the conflict is set according to the desired resolution (see the Resolution enum). Possible values are (I discuss only the options relevant to simple merge scenarios):

  • AcceptYours – local version is to be used for merge

  • AcceptTheirs – server version is to be used for merge

  • AcceptMerge – resolve conflict by doing manual merge

Now, in my code I use AcceptMerge, as it is expected that the user will create the merged version locally using the merge tool.
After the resolution is set, the ResolveConflict method is called to signal that the conflict is resolved (if the resolution succeeded, the IsResolved property of the conflict will now return true). By the way, while we are on the subject of conflict properties, another useful one is IsNamespaceConflict; it lets you know whether the conflict is in the file content or in the version control namespace (a rename etc.)
Surprisingly, the AcceptMerge resolution option goes an extra mile for you and does something similar to the code snippet below:

    if (conflict.IsResolved)
        File.Copy(conflict.MergedFileName, conflict.TargetLocalItem, true);

After the resolution, you have an Edit pending on the file and the merged file as your local copy.

Once all conflicts are resolved, it is possible to check in the changed files and thus complete the merge.
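A minimal check-in sketch might look like this (the check-in comment text is made up):

    // check in all changes pended in the workspace to complete the merge
    PendingChange[] pendingChanges = workspace.GetPendingChanges();
    if (pendingChanges.Length > 0)
        workspace.CheckIn(pendingChanges, "Merged from $/Project/Ongoing/Solution");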

But conflict resolution raises several additional issues (even without taking renames and deletes into account). For example, if during conflict resolution one specifies that the source version should be taken, that essentially means the local file after resolution must be identical to the source version. It turns out that ResolveConflict handles those situations for you: for example, after a resolution with AcceptTheirs you will have the source version in your workspace without doing anything extra.
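So taking the source version wholesale is just a matter of (same pattern as the loop above):

    // take the server (source) version; ResolveConflict updates the local file
    conflict.Resolution = Resolution.AcceptTheirs;
    workspace.ResolveConflict(conflict);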

Obviously, the same conflict resolution steps may be used when resolving conflicts that occur on check in (though I am not sure I see ready automation scenarios there).

In conclusion, I would not recommend using the Version Control API for merges and conflict resolution, but rather suggest sticking to the command line tf.exe client for advanced scenarios. While the thing is doable, you should be prepared to spend quite an amount of quality time on it, and be prepared to fix bugs later (mostly related to the myriad scenarios and border cases you did not think of).

Please take the examples above with a grain of salt; if you find errors/omissions do let me know so I can keep it updated.

Friday, November 24, 2006

(Not) getting latest on check out - a bug?

Did you know that TFS will not automatically get the latest version on check out? And what do you think about that?

In all probability, you know about that particular feature and roll with it (if you use TFS, that is). But the question that still appears to be actively discussed is whether it is such a big deal that “get latest version” is not done automatically on check out. When I started using TFS, my first impulse was to consider it somewhat problematic (coming from a VSS background). But after working with TFS for a year, and being now somewhat older (and maybe even wiser), I do not feel that way anymore, and even consider it an advantage of TFS Version Control over Visual SourceSafe. But as it happens, there is an opposite point of view; out of disagreement with it I was moved to write this post.

So let us start with a short preamble. If you used Visual SourceSafe for version control, you used it with exclusive check out only (yes, I know it allows files to be checked out concurrently, but I have never heard a success story of using VSS in that manner – while hearing lots of stories to the contrary). When you perform a check out using VSS, it conveniently retrieves the latest version for you and makes it writable.

Enter TFS Version Control. By default, check out is performed concurrently. When a check out is performed, the local file is made writable (no version is retrieved from the server).

My conclusion at that point would be “Wow, Team Foundation Server is not the same as Visual SourceSafe and uses a different source control model, so we need to learn something about it and maybe even adjust our practices!” Should have been a no-brainer, that one, don’t you think? But strangely enough, people tend to overlook that point from the very beginning and try to use TFS VC as the next version of VSS.
At that stage, the typical issues that arise are:

  • TFS won’t get latest version for me before check out

  • TFS will perform concurrent check out by default

  • TFS by default will not place any lock on checked out file

Can we work around these issues to make TFS exactly like SourceSafe? Unfortunately no. Is TFS any worse for that? Absolutely not. And I am going to prove that!

I am going to take a real life example. Before I got my hands dirty with TFS, I participated in the development of a largish application as part of a team of around 30 developers, using VS 2005 and VSS for source control. Let’s have a look at a typical check out and what can (and did) happen after it.

  1. The file I checked out was not modified on the server. Then essentially I have the latest version on my workstation, and simply making it writable would suffice. As nothing has changed, I will be able to compile my project without problems.
  2. The file I checked out was modified by someone else and checked in, so my version is outdated. Get latest retrieves the newer version, but luckily the changes are such that my build still compiles.
  3. The file I checked out was modified by someone else and checked in, so my version is outdated. Get latest retrieves the newer version, but now the changes are such that my project does not compile (for example, a method signature in that file was changed, and that very method is used throughout other project files).

As one may see, only the third case creates a problem. Now, I bet that when you checked out that file you were not planning to integrate the changes made to it on the server into your project. My guess would be that you are implementing some feature, and implementing it requires modifying that file, so you checked it out. And now you cannot compile your project.
So instead of coding you are integrating changes. If the project is large, you probably will not perform a recursive “Get latest version” on the whole project (as in VSS it would take eons of time, and you are in the middle of development!). What you do instead is try to handle the files one by one – let’s perform get latest for the files that break my build! Surely that will help! Ok, you do that. And it turns out that the latest version of that other file breaks your build in some other place. That’s called a chain reaction! At that point you have two choices – either perform “Get latest version” file after file until the project compiles, or start that recursive “Get latest version” beast and go pour some coffee (I assume that beating the crap out of the guy who broke your build is not a valid alternative :).
Here we go as far as VSS is concerned – out of three cases, two work nicely and one is a major pain. I often worked over VPN (with a full recursive “Get latest version” on the project taking lots of time), so I was full of apprehension every time I checked out a “popular” file.
I can understand why the dev guys at Microsoft wanted to help users out in that problematic case. The TFS solution is elegant, easy to understand, and supports concurrent development at that! But it appears it is never a good idea to take away freedom of choice (even if it means preventing people from shooting themselves in the foot). Anyway, here goes the TFS solution:

  1. The file I checked out was not modified on the server. Then essentially I have the latest version on my workstation, and TFS makes it writable. My project compiles as it did before check out, and there will be no problem checking in, as I am the only one who changed the file.
  2. The file I checked out was modified by someone else and checked in, so my version is outdated (but the changes are such that they do not affect other files). The local file is made writable, my project compiles as it did before check out, and all is well until check in. On check in there will occur a conflict for me to resolve (more on that later).
  3. The file I checked out was modified by someone else and checked in, so my version is outdated (the changes are such that they do affect other files). The local file is made writable, my project compiles as it did before check out, and all is well until check in. On check in there will occur a conflict for me to resolve (more on that later).

In TFS, two cases created a problem for me instead of one in VSS! What’s happening here? We have paid some serious money for that souped up VSS and it cannot even check in files, huh?
In fact, several things happened, not all of them obvious.
First, what you get is a boost in immediate productivity – the developer is allowed to develop (supposing one checks out in order to add new changes) without interruption; integration is delayed until development is complete.

Second, there is overhead for conflict resolution on check in. That part is tricky, and I am afraid that here it will be my personal opinion vs. yours. But as it is my blog, I am not afraid, so I can state plainly that the overhead depends on the quality of your code, the quality of your development tasks and your engineering people.
If the code is modular, and each engineer performs a well-defined task, in all probability the conflict will be resolved automatically – that is, the new version of the file will have changes that do not overlap with the changes made by the other developer. (In agile shops developers may more often perform code-breaking tasks; but in agile settings effective communication is the key, so I assume it is relatively easy to handle merge conflicts effectively and in real time.)

But in the real world we have code-breaking changes, and code does overlap! Does TFS do a better job, then, by highlighting those conflicts after the fact (as compared with VSS, which by breaking your build signals you before the fact)? In my opinion, the TFS VC approach is indeed better, and here is why. You check in your changes, the conflict cannot be resolved automatically, and you have that three-way merge window to stare at. At that point, you are either qualified to perform the merge, or not qualified to do so. How can you be not qualified? For example, if the code you have written does a different thing from the same lines of code in the server version of the file. But wait a minute, that’s a sign of a different problem! You have been doing your job in parallel with someone else, and at that point the problem surfaces in TFS while VSS would be hiding it!

I am well aware that my reasoning is not perfect, but overall I believe that adoption of TFS will lead to a significant productivity gain over VSS, even though it may require some changes in work habits. Would you like to do concurrent development? If so, how well would your VSS-centered model fare? Your new practices should answer those questions as well, maybe even before you start thinking about how that get latest stuff will affect your development.

To conclude: while that get latest thing may seem a deficiency, and surely the user must have a choice (as one apparently will in TFS v2), I do not view it as a showstopper, and I strongly believe that a client may be made aware of the TFS advantages over VSS using that very feature (or the absence of it – depending on how you look at it). It seems that Microsoft somewhat underestimated the impact VSS has had on development practices over the years; but as VSS addicts gain more hands-on experience with TFS, I do hope that VSS-only work patterns will fade away.

And to add up to this argument, some links from MS development team on that very subject:
Buck Hodges blog post (read the comments as well)
Adam Singer blog post (excellent read, but be prepared - it is longer than that one)

I tried to be as concise as possible (unless the Thanksgiving dinner somewhat got in the way :), but please drop me a line to let me know what you think and where I might have been wrong.

Tuesday, November 14, 2006

Creating custom tasks for Team Build

When you create a task for Team Build, the usual rules for custom MSBuild tasks apply, but there are some small differences.

Probably, when you create a custom task for Team Build, you will want to use the TFS object model to access version control. For that purpose, you will first need to establish a connection to the TFS server.

The most obvious approach would be to implement the task with a server name parameter and establish the connection using that parameter:

public class SimpleTfsTask : Task
{
    private string _serverName;

    public string ServerName
    {
        get { return _serverName; }
        set { _serverName = value; }
    }

    public override bool Execute()
    {
        // the server name is supplied from the build script
        TeamFoundationServer server =
            TeamFoundationServerFactory.GetServer(_serverName);
        // ...
        return true;
    }
}

Now, the downside of this approach is obvious - the server name must either be placed inside the build script or, somewhat better, supplied as an external parameter for the build.

But wait a minute - how come most of the predefined Team Build tasks (for example, those in the Microsoft.TeamFoundation.Build.Tasks.VersionControl assembly) do not have that parameter? Where is the Team Foundation Server URL retrieved from?

The solution is rather simple - Team Build tasks that do not have a server parameter must be used in the context of a workspace (that is, it is assumed that the build workspace exists before those tasks are called). Tasks that cannot have such a context (for example, CreateWorkspaceTask) have a TeamFoundationServerUrl parameter.

Let's have a look at how to get the Team Foundation Server URL given a valid workspace.
The code below first gets the workspace in which the specified local path is mapped (not the Workspace class but rather WorkspaceInfo, which uses the local cache), and then retrieves the Team Foundation Server data from the workspace properties.

public class AnotherTfsTask : Task
{
    private string _localPath;

    public string LocalPath
    {
        get { return _localPath; }
        set { _localPath = value; }
    }

    public override bool Execute()
    {
        // look up the cached workspace that maps the given local path
        WorkspaceInfo workspace =
            Workstation.Current.GetLocalWorkspaceInfo(_localPath);
        TeamFoundationServer server =
            TeamFoundationServerFactory.GetServer(workspace.ServerUri.AbsoluteUri);
        // ...
        return true;
    }
}

The obvious advantage of the second approach is that if you have a workspace (retrieved either by path or by workspace name), you may always create a TeamFoundationServer instance to access version control or its other services.

In conclusion, I would like to note that it is wise to use TeamFoundationServerFactory rather than the constructor to create TeamFoundationServer instances, as the factory approach uses caching; given that a single build script usually connects to only one TFS server, that may result in a noticeable performance boost.

Wednesday, November 08, 2006

TFS folder items and history

Recently, Richard Berg wrote a very concise post on what items are in TFS version control. In the post he states "TFS rarely makes a distinction between files and folders". While that is surely true on a high level, there are still some important differences in the small implementation details.

The one difference I'd like to address is history representation. The history for a folder displayed in the History tool-window in Visual Studio contains both changes performed on the folder itself and changes to any files/folders contained in it. For example, when a folder is renamed, the change will appear in its history together with the change resulting from a new file being added to the folder.

To make things more interesting for the user, the History window for a folder does not indicate whether a change affected the folder itself or only the files in it (in History Sidekick we tried to display that information - history entries that include folder changes are marked as such). Also, folder history does not allow version comparison by simply selecting two changesets in the list (though you may compare folder versions using Tree Diff from the TFS Power Toys).

But other than that, files and folders do behave alike, and I hope that in the next TFS version folder history will be much more similar to file history.

On a related note, I would highly recommend watching Richard Berg's blog as he unveils the mysteries of TFS Version Control in his upcoming posts (two posts so far, so it's high time to jump on the bandwagon).

Sunday, November 05, 2006

Changing work item types affects source control

You'd ask how changing a work item type could affect source control? My first answer would be - it will not really change anything, as the only connection between source control and work item tracking is through the association of changesets or files with work items. But it turned out to be not that simple.

Let's say that you have introduced a new validation rule as a result of a work item type change, and some field that was optional became mandatory. If there were existing items at the time of the change, upon changing and saving any of these items you will have to specify a value for the field that became mandatory. But how is that related to source control?

Here goes the scenario: you are checking in some files and want to associate them with work items in the "Pending Changes" window, and the work items have data that became invalid as a result of the work item type conversion. The files to check in are selected, the work items to associate with are selected, you hit Check In - but it's a no go! The following message is displayed:

And if you think about it some more, it really makes sense, since the default check-in action for a changeset and work item association is "Resolve", and that requires work item modification.

But what about changing that check-in action? Let's change it to "Associate" and see if the check in fares any better. No luck here (and I would say the result is even worse) - the files are checked in and the changeset is created, but the association to the work items is not, as seen on the following screenshot:

In that case, I am not so sure the behavior exhibited is the one expected. I am creating an association between a work item and some other artifact - should full validation be performed? And why is the validation performed only after the check in?

To conclude, it would be wise to handle data conflicts in existing work items as part of the work item type modification. When you have hundreds of work items and modify the type, you do not want all your developers to fill in the missing data - that is clearly part of the conversion (and I am not even talking about breaking changes; making several fields mandatory can seriously affect productivity).

Tuesday, October 31, 2006

How to handle paths with TFS Version Control object model

After developing with the TFS Version Control object model for quite a while, I have come across a very helpful class. I wish I had found it a couple of months earlier, as it would have saved me quite a bit of time writing string parsing functions.
I am talking about the VersionControlPath class. This is a static class located in the Microsoft.TeamFoundation.VersionControl.Common assembly, and it contains a ton of routines you might need when working with version control items.

To give you a small sampling:

// Returns folder name from item path ("$/Project/Folder" from "$/Project/Folder/File.txt")

public static string GetFolderName(string item)

// Returns project name from item path ("Project" from "$/Project/Folder/File.txt")

public static string GetTeamProjectName(string item)

// Check whether specified path conforms to Windows or TFS path syntax (basically contains / or \ delimiter)

public static bool IsServerItem(string path)

// Prepends path with root $ char if required

public static string PrependRootIfNeeded(string folder)

// Parses item path and returns folder and file item paths

public static void Parse(string item, out string parent, out string name)

There are plenty of other methods in VersionControlPath. Most of them have self-explanatory names, but they may be a bit tricky to work with. For example, the IsValidPath method will return true both for "$/Project/Folder" and "/Project/Folder" paths.
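A short usage sketch (the item path is a made-up example; the expected results for the first two calls follow from the comments above):

    string item = "$/Project/Folder/File.txt";

    string folder = VersionControlPath.GetFolderName(item);       // "$/Project/Folder"
    string project = VersionControlPath.GetTeamProjectName(item); // "Project"

    string parent;
    string name;
    VersionControlPath.Parse(item, out parent, out name);         // splits into parent folder and item name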

I wish I could direct you to MSDN, but documentation there is pretty thin (to be diplomatic about it).

If you find any interesting gotchas in that class, I would be delighted to know. Drop me a line.

Tuesday, October 24, 2006

HasTrailingSlash in MSBuild scripts

There is a useful undocumented function that may be used in MSBuild script conditions, called HasTrailingSlash. As its name implies, the function checks its only argument for a trailing slash:

Sample updated at 01-Feb-2008 (recently somebody complained that while the function works, my old sample does not :):

<!-- Initial target will be always called first to verify properties -->
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003"
         InitialTargets="VerifyInputParameters">

    <!-- Target verifies that the property is not empty (throws error) -->
    <!-- and has trailing slash (adds one if no slash provided)        -->
    <Target Name="VerifyInputParameters">

        <Error Condition="'$(ExternalPath)' == ''" Text="ExternalPath is empty" />

        <CreateProperty Condition="!HasTrailingSlash('$(ExternalPath)')"
                        Value="$(ExternalPath)\">
              <Output TaskParameter="Value" PropertyName="ExternalPath" />
        </CreateProperty>

    </Target>

    <!-- Here goes the rest of the project -->

</Project>

The same check is also handy directly in a property group, e.g.:

 <OutputPath Condition=" !HasTrailingSlash('$(OutputPath)') ">$(OutputPath)\</OutputPath>
The thing is, the function is used extensively in Microsoft system targets files (for example, in Microsoft.TeamFoundation.Build.targets file) and still does not appear in MSBuild documentation. Beats me...

See also new TFSBuild site for more details.

Monday, October 16, 2006

Getting associated work items for changeset

Lately we have been working on the next version of our Team Foundation Sidekicks application (more specifically, on the labels-related Sidekick), and as part of the development it was required to retrieve all work items associated with a changeset.
Initially, the following simple code was used:

Changeset changeset = server.GetChangeset(changesetId);

Once the Changeset object is instantiated, its WorkItems property readily makes all associated work items available.
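For example (a sketch; server is assumed to be a VersionControlServer instance and changesetId a valid changeset number):

    Changeset changeset = server.GetChangeset(changesetId);
    foreach (WorkItem workItem in changeset.WorkItems)
    {
        Console.WriteLine("{0}: {1}", workItem.Id, workItem.Title);
    }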

The problem was the method's performance; after some searching I found Naren Datha's post solving the same task. The code there uses the WorkItemStore and ILinking objects and looks much more complex (compared to the one-liner we used before). Heck, it must work quicker (or so I thought)!

But after some benchmarks, it turned out that the two approaches have very similar performance. I did not perform exhaustive research, but the execution time was essentially the same (within a 10% delta).

So my guess would be that both methods use the same core queries; and as far as complexity goes, we shall stay with our previous one-liner. Wouldn't you?

Removing iteration hidden goodies

I have just found a new feature in TFS! It still amazes me how, after a whole year of being involved with TFS, there are still hidden and unexplored places to go.

In short, I was going to remove an iteration path I had created by mistake, and there goes TFS, displaying the window shown below:

Frankly, I was not thinking about work items at the time, as I knew the iteration path had no items assigned, but TFS thinking about that on my behalf - that is all goodness. Sure, if I had some work items I would want to migrate them, and the thoughtful dialog displayed is not a bad way of doing it.

I guess the possible usage of the feature (besides deleting iterations created by mistake :) would be moving work items in iterative development. For example, in one setup we frequently assign all strange low priority tasks to a "Backlog" iteration in a milestone; so it would be convenient to delete that backlog and move all leftover backlog work items to the next milestone.

It would be interesting to see if someone out there is in fact using that feature.

Wednesday, October 11, 2006

Renaming Team Project

When you create a new Team project, give some serious consideration to its name. A Team project cannot be renamed in TFS v1 (no comment about it - you may find some opinions in the MSDN usergroups).

The rename operation seems so obvious to folks that I thought it worth posting about. You may well imagine with what incredulity the absence of the feature is met when the need to rename a project arises (there you have some really colorful language :).

If you have already named your project and are looking for ways to rename it, you may use the following workaround (sort of a workaround, because you won't really end up with a renamed project with identical contents):
1) Create a new project with the desired name
2) Copy work items from the old project to the new one (one by one, as there is no bulk copy option)
3) Move all source control folders from under the old project folder to the new project folder
4) SharePoint portal documents cannot be moved in bulk (as far as I know), so you move them manually
Moving the source code will retain the files' history, and the work items will also have partial history, but overall I would say the workaround is not worth the labour (and the SharePoint docs will not have their history). At any rate, selecting the right name in the beginning beats any workaround by far.

So be circumspect when you name your Team project!

Update: I stand corrected, as there exists a freeware utility for moving work items between Team Projects (written by Eric Lee); see posts here and here.

Saturday, September 16, 2006

Cached TeamFoundationServer

Yesterday I read Buck Hodges' blog post on how to get an instance of TeamFoundationServer. Getting a TeamFoundationServer is the first thing one does when writing code that utilizes the TFS Version Control object model, so naturally it is worth knowing. Frankly, I did not think I would discover something new, but you live and you learn...

The two choices available are the TeamFoundationServer class constructor and the TeamFoundationServerFactory.GetServer method. Buck covers the usage quite nicely in his post. The point of interest for me was that the factory method will actually return the same object across two different calls if given the same URL as the GetServer parameter.

That means if you use GetService method of returned TeamFoundationServer instance, it will essentially be the same service! So for example, if you retrieve VersionControlServer and hook up onto some event, you will need to do it only once; the second instance of TeamFoundationServer returned by factory will be the same and will return the same VersionControlServer with event handler set (below is pseudo code just to visualize the idea; no chance it will compile):

// first place
TeamFoundationServer tfs1 = TeamFoundationServerFactory.GetServer(url);
VersionControlServer vc1 =
    (VersionControlServer)tfs1.GetService(typeof(VersionControlServer));
vc1.NewPendingChange += OnNewPendingChange;

// second place
TeamFoundationServer tfs2 = TeamFoundationServerFactory.GetServer(url);
VersionControlServer vc2 =
    (VersionControlServer)tfs2.GetService(typeof(VersionControlServer));
// not required! tfs2 is the same cached instance, so the handler is already attached
// vc2.NewPendingChange += OnNewPendingChange;

So that is something you'd want to keep in mind while writing your applications.

P.S. And an additional piece of wisdom from the comments on the post:
"I recommend obtaining services from TFS OM for all services except the WorkItemStore. It is not thread safe, whereas all other services you obtain from TFS OM are. To work around this issue, create a new WorkItemStore object and pass the credentials that you get from the TFS OM."
It is not official and I have not verified it, but I love to collect those bits of information. You never know when it may come in handy...
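For what it's worth, a minimal sketch of what that workaround might look like - this is my interpretation, not official guidance; the server URL is made up, and using the WorkItemStore(TeamFoundationServer) constructor is an assumption on my part:

```csharp
using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation.WorkItemTracking.Client;

class WorkItemStorePerThread
{
    static void Main()
    {
        // The factory-cached server instance is fine to share between threads...
        TeamFoundationServer tfs =
            TeamFoundationServerFactory.GetServer("http://tfsserver:8080");

        // ...but rather than tfs.GetService(typeof(WorkItemStore)),
        // create a separate WorkItemStore for each thread; the server
        // instance already carries the credentials established by the OM.
        WorkItemStore store = new WorkItemStore(tfs);
    }
}
```
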

Friday, September 15, 2006

Merging Visual Studio solutions

Recently, I read a post on MSDN, and it reminded me of an important issue I have been meaning to raise for quite some time.
The scenario is rather simple - let us say you branch a folder that contains a Visual Studio solution (or project); then you perform development in both branches. At some stage you merge one branch into the other.

While it is obvious how code files (C#, C++ etc.) are merged, for Visual Studio project and solution files it is less so. Even if you do not make advanced changes in those files (for example, specifying custom pre-/post-build steps), Visual Studio itself may change the file (the problem is described in the post). And when you merge, usually there is no conflict, the changes are merged automatically - and thus you can end up with an invalid solution or project file!

It appears there is no magic-bullet solution for the issue in the current version of TFS. What I do is an essentially manual procedure: the idea is to check whether any solution/project files were merged. In most cases there are no conflicts to resolve, so I manually review the merged solution/project files before checking them in, to make sure that the automatic merge changes make sense. It may be paranoid, but it is way better than a broken solution.

More than that, after some thought on the subject, I do not see how it could be handled (aside from a customized merge wizard specifically for Visual Studio solutions and projects). Any thoughts on the subject would be appreciated (I believe the Microsoft guys will thank you as well).

Wednesday, September 06, 2006

Copying work items - hidden gotchas

Today I came across Eric Lee's post about copying work items. I also discovered this function quite accidentally and have been happily using it for several months already.

So you right-click a selected work item in Query Results (or an open work item) and click "Create Copy of Work Item..." - and voila! A new item with identical data is displayed, so you can modify and save it. It allows one to avoid the hassle of copying common fields, or to easily copy an item to another project.

All goodness, but there are some not so obvious features within...

First, the newly created work item will be linked to the source work item (the work item you copied from). If that is not your intention, and you do not glance at the "Links" tab contents - you are in for a surprise. And if you do that for some time, you end up with a whole lot of links. For example, say you have Item 1, then create Item 2 (by copying Item 1) and then create Item 3 (by copying Item 2) - now, how many linked items will Item 3 have? Two - Item 1 and Item 2. That is surely a feature to be aware of (especially if you do not want those items linked)! I discovered it only after I had created a whole bunch of interlinked items...

Additionally, there is something very interesting in the history of the newly created work item. If you create items one after another (as in the example above), all that information is saved in the history!

Here you can see the copied work item's history:

And here is the first history entry expanded (and that is only a part of it):

Not that I care much about that information currently. It may be useful if you are trying to propagate a bug through several Team Projects (say, a bug found in "Project 1" is copied to "Project 1.1" and then to "Project 1.2" - the data will be visible in the history); but with the current implementation of Team Projects I doubt it is of much use. On the other hand, if you are copying items only for convenience, I do not see how that information is useful to anyone.

Those two I discovered over some two months of usage; but I will not be surprised if there are additional goodies in that function. And I wonder - what was the original author's idea?
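Incidentally, the behavior seems to be mirrored in the work item object model: the WorkItem class has a Copy method which, if I am not mistaken, also links the copy back to the original. A rough sketch (the server URL and work item id are made up):

```csharp
using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation.WorkItemTracking.Client;

class CopyWorkItemExample
{
    static void Main()
    {
        TeamFoundationServer tfs =
            TeamFoundationServerFactory.GetServer("http://tfsserver:8080");
        WorkItemStore store =
            (WorkItemStore)tfs.GetService(typeof(WorkItemStore));

        WorkItem original = store.GetWorkItem(42);  // hypothetical id

        // Copy() clones the field values; just like the UI command,
        // the new item ends up related to the original one.
        WorkItem copy = original.Copy();

        // If you do not want the "copied from" link, inspect and clean
        // copy.Links before saving.
        copy.Save();
    }
}
```
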

Friday, August 18, 2006

Circular references in Active Directory

Recently, in user feedback on the Team Foundation Sidekicks application, we dealt with an issue that may be of importance to people with complex Active Directory dependencies between user groups.

To put it simply - if you have circular relationships between user groups in AD, and those groups are used in TFS, the IGroupSecurityService ReadIdentity method will fail. I have seen it only on one occasion and do not know how that API is used in the TFS core, but my experience with Active Directory tells me that circular relationships are not very rare to come by.

So if that applies to you as well, have a look at the following post.

Shared workspace mappings

For everyone who has used TFS source control, it is a well-known fact that it is impossible to create more than one workspace mapping to the same directory (whether in a single workspace or in different workspaces on the same workstation). When you try to do that, a TFS error message pops up informing you that the path is already mapped somewhere.

On a non-shared computer there is no problem with this; you just use the workspace with the mapping or create a new path.

Now, on workstations used by several people (for example, integration stations) there may be a lot of value in a "shared" workspace, namely so that each user, after logging in, has a mapping in his workspace to the same path. Until today, I did not think that was possible, but I came across a very interesting post in the MSDN newsgroup that suggests a solution.

The solution is simple and elegant! You just map a drive letter to the directory (for example, with the subst command), and while users Alice and Bob have mappings in their workspaces to G:\ProjectA and F:\ProjectA, they will in fact be working with the same project in C:\src\ProjectA. One might note that it is not a single shared workspace but rather a workspace per user with the same mapping, but heck - that's the best solution we have! It surely beats having a separate directory for each user just for the mapping.

Kudos to Nate for suggesting the solution.

Sunday, August 06, 2006

Maximum number of Team projects

Today, while reading the MSDN newsgroup, I came across an interesting piece of information I was not aware of before. It turns out that Team Foundation Server supports a maximum of 500 (five hundred) Team Projects (as stated in the official documentation); though the newsgroup thread states that the limit is not a hard one, it is still the official number.

While 500 is quite a sensible number, I do find it rather disturbing; while not many organizations will have more than 10 million versioned items per project (10 million, as limitations go, is not very restrictive), I can easily imagine an organization having 500 logically separate software products or modules.

As Microsoft prefers working with a few projects (separating modules by using source control structure and area paths), it appears that having a great many projects is rather a gray area as of now. I would definitely say the limit is one to be aware of in planning, though hardly a critical limitation.

Thursday, July 06, 2006

Apply Label adventure

When one performs "Apply Label" on a file or folder in TFS source control, the operation may lead to not entirely expected results.

Let us say that Bob has selected the file $/Northwind/foo.cs, right-clicked it and chosen the "Apply Label" menu. The file is selected by default, so Bob proceeds from the "Choose Item Version" dialog to the "Apply Label for foo.cs" dialog. There Bob specifies the label name Test Label and hits OK. As expected, a new label is created containing foo.cs.

But when Bob performs the same sequence for the file $/Northwind/foo1.cs and specifies the same label name, the file is added to the existing label! The user is not notified in any way that the label already exists, and after the operation completes, Test Label contains two files, foo.cs and foo1.cs. More than that, if one were to specify the label name as tEST lABEL, TFS would still consider it the same as Test Label.

While the operation is called "Apply Label", and one may argue that from a wording standpoint it does exactly that - applies a label to the selected item versions (creating it only if needed) - I believe this is behavior to be aware of, as a user may inadvertently add items to an existing label while believing a new label is being created (to say nothing of the case insensitivity of label names).
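By the way, the merge-into-existing-label behavior appears to be controllable from the object model: VersionControlServer.CreateLabel takes a LabelChildOption parameter. A hedged sketch (the server URL is made up, and my reading of the LabelChildOption semantics is an assumption):

```csharp
using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation.VersionControl.Client;

class ApplyLabelStrict
{
    static void Main()
    {
        TeamFoundationServer tfs =
            TeamFoundationServerFactory.GetServer("http://tfsserver:8080");
        VersionControlServer vcs = (VersionControlServer)
            tfs.GetService(typeof(VersionControlServer));

        VersionControlLabel label = new VersionControlLabel(
            vcs, "Test Label", vcs.AuthenticatedUser, null, "label comment");

        LabelItemSpec[] items = new LabelItemSpec[]
        {
            new LabelItemSpec(
                new ItemSpec("$/Northwind/foo1.cs", RecursionType.None),
                VersionSpec.Latest, false)
        };

        // LabelChildOption.Fail should make the call fail on items that
        // are already labeled, instead of silently merging into the
        // existing label the way the UI does.
        vcs.CreateLabel(label, items, LabelChildOption.Fail);
    }
}
```
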

Monday, June 26, 2006

Get latest version on check out

The question of whether "Get latest version" of a file should be performed on check out by TFS, rather than manually by the user, has been discussed multiple times in the past and is raised almost weekly in the forums.

While it appears that Microsoft will provide at least a configuration option in future versions, it will not be in SP1 (see Richard Berg's post).

As Microsoft has been promising to release the SP for quite a while, and it is still in the works and will not contain the feature mentioned, it appears the community will have to learn to get by without that option. I hate conspiracy theories, but it surely appears that the users are being educated against their will. We shall all see what the result will be.

In my opinion though, at least a survey on the issue would be a great idea. Hopefully, someone at Microsoft is listening...

Monday, June 12, 2006

Source control path length limitation

When defining folders in the source control repository (or adding local path hierarchies), it may be useful to keep in mind that the maximum path length (of the concatenated path) may not exceed 260 characters, for both server and local paths.

I thought the days of MAX_PATH (for those in the know) were long gone, but it appears that TFS has this maximum length limit in its database (see the post about the issue).

To make a personal confession, having an upper limit imposed is no problem for me. Sensible folder hierarchies and file names should not take more anyway.

Generally, the most probable way to run into the issue is to use long local paths (the worst ones of the form C:\Documents and Settings\dumb_user\My Documents\Visual Studio 8\Projects\...) - but in those cases user education will solve the issue. It may even be for the better that such users receive a maximum length error - source code has no place in My Documents.
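If you automate adds via the object model, a trivial client-side guard can save a round trip to the server. A sketch (the 260-character limit is the one mentioned above; the helper below is hypothetical, not a TFS API):

```csharp
using System;

class PathLengthGuard
{
    // TFS limits the full path length to 260 characters - the classic
    // Windows MAX_PATH value - for both server and local paths.
    const int MaxTfsPathLength = 260;

    static bool IsTooLong(string folder, string fileName)
    {
        // Server paths use '/', local paths use '\'; either way the
        // limit applies to the concatenated result.
        string fullPath = folder + "/" + fileName;
        return fullPath.Length > MaxTfsPathLength;
    }

    static void Main()
    {
        Console.WriteLine(IsTooLong("$/Project/Ongoing/Solution", "foo.cs"));
    }
}
```
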

Friday, May 05, 2006

Pending rename folder change

While renaming a folder in the Solution Explorer tool window in Visual Studio (vs. performing the operation in Source Control Explorer), I discovered one tricky point.

The following sequence of operations is performed:

* Initiate operation by using "Rename" pop-up menu on selected folder in Solution Explorer

* Change the folder name

* The "Check Out Files" dialog will appear to check out the project file; set the desired lock type and click "OK"; the renamed folder will appear in the project. If the "Check out automatically on edit" option is set, the user will not be prompted and the required files will be checked out automatically

* Check in the project files, using either the "Pending Changes" window, the "Check In …" pop-up menu in Solution Explorer, or the "File->Source Control->Check In …" menu, and clicking "Check In..." in the dialog that appears

Now, the walkthrough appears pretty obvious. But there is a catch - if the "Filter by solution" toolbar button is toggled in the check-in window to display only the solution's changes, the "rename" change is not there! And it will dangle there pending for ages, unless you toggle off the filter or look into Source Control Explorer.

All in all, there is nothing complex here, but in my opinion this point is definitely one to be aware of.

UPDATE: It was confirmed by MS that the behavior is a bug and is going to be fixed. There is also another minor bug mentioned in the MS post, so read and be enlightened.

Tuesday, April 25, 2006

"Resolve conflict hangs Visual Studio" issue

UPDATE: Microsoft devs say that:
* The issue occurs only rarely
* The problem usually may be localized to a single file, so the workaround suggested is sufficient
* A fix was already implemented and will be distributed in the first patch

So overall that downgrades the issue from "panic mode" to "something to be aware of".

See the full discussion on the issue.

Some TFS users complained that the following scenario can hang the application:

1. Perform a merge between two branches
2. If there are conflicts, they are identified and the Conflicts dialog is displayed
3. Clicking Resolve brings up the Resolving Conflicts dialog, saying that the summary is being built.
At this point the application never comes back. What's more, it happens only for some files under some conditions (not exactly identified).

Saturday, April 22, 2006

How to determine source controlled solution workspace in Visual Studio

When you have several solutions in different workspaces, it is not always easy to remember which workspace you are working with (say, when you load the solution into Visual Studio from the file system). But you do need to know the right workspace when performing operations such as branch, which are not supported from the Visual Studio Solution Explorer but rather from the TFS Source Control Explorer.

The answer is short - when you load the solution and open the Pending Changes window, the workspace selected by default is the one the solution is associated with.

If you change the workspace though, your selection will be remembered only until the solution is reloaded.

Thanks to Richard Berg for the answer.

Thursday, April 20, 2006

Opening local solution of other user

The following scenario may occur on shared computer:

1. Bob logs in using his account to computer A
2. Bob performs get latest to folder for the solution in his workspace
3. Bob performs required activities with the files in his workspace, finishes up his activities and logs off
4. Alice logs in using her account to computer A
5. Alice opens solution in folder using "Open..." Visual Studio menu
6. An error saying that "bindings are improper..." is displayed

The scenario is supported by Visual SourceSafe but is not possible with TFS (see the error in step 6).
What the user (Alice) should do in TFS is perform "Open from Source Control" for the solution into her own workspace (workspace2) and folder (folder2).

An additional point to be aware of: a message saying the bindings are incorrect may be caused by the scenario above.

See issue discussion here.

Tuesday, April 18, 2006

View deleted files in the Source Control Explorer

The Vertigo blog describes a very useful feature I have been looking for for ages.

If you want to undelete deleted files in the GUI, you may set the Source Control Explorer to display deleted files.

The option may be set using the menu Tools->Options->Source Control->Visual Studio Team Foundation, by checking the magical "Show deleted items in the Source Control Explorer" checkbox.

Something every TFS practitioner must know how to do!
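The same information is reachable from the object model, by the way: the GetItems overload that takes a DeletedState lets you list deleted items under a path. A sketch (the server URL and path are made up):

```csharp
using System;
using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation.VersionControl.Client;

class ListDeletedItems
{
    static void Main()
    {
        TeamFoundationServer tfs =
            TeamFoundationServerFactory.GetServer("http://tfsserver:8080");
        VersionControlServer vcs = (VersionControlServer)
            tfs.GetService(typeof(VersionControlServer));

        // Ask explicitly for deleted items - the API counterpart of the
        // "Show deleted items" checkbox.
        ItemSet deleted = vcs.GetItems(
            "$/Northwind", VersionSpec.Latest, RecursionType.Full,
            DeletedState.Deleted, ItemType.Any);

        foreach (Item item in deleted.Items)
            Console.WriteLine(item.ServerItem);
    }
}
```
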

Undo add of solution/project

If you have added an unbound solution/project to TFS source control and want to undo the operation, there are several gotchas.

First, you perform "Undo Pending Changes" to undo the pending Add change. All is cool, the operation is undone, and it is possible to work without source control as before.

But let's say that you finally decide to add the solution/project after all. Several findings here:

* The "Add to Source Control" functionality is not available! OK, I have undone the operation, but now I do want to add my solution!
Workaround: go to File->Source Control->Change Source Control, and there you will discover that a binding of the solution to TFS still exists. Unbind the solution/project, and then the "Add to Source Control" functionality becomes available

* Now, when performing the Add, TFS does not ask you for mappings. It adds your files to the path you specified before the Undo operation!
Workaround: go to your workspace, and there you will discover that there is still a mapping for your solution. Remove that mapping - then, on Add, TFS will ask you to specify the path.

Overall, I would say these are facts to be aware of (as Microsoft says here, they are aware of these somewhat unobvious workarounds, so any TFS user should be as well)

Conflict resolution window

It is stated in Chris Rathjen's blog that the conflict resolution window may be displayed in three different cases.

I must check that thoroughly! I have seen those empty windows now and then, never quite knowing why...

Monday, April 17, 2006

How lock set on file is affected by check out

When a check out operation is performed on a file, the lock set will be that of the check out option specified. This may lead to some not entirely obvious consequences.

Obvious example
1) No lock on the file prior to check out
2) After check out is performed with a lock option other than "None" ("Check Out" or "Check In" lock), the file is editable and has the changes "lock, edit" associated with it (in the "View Pending Changes" window)

Not so obvious example
1) A lock on the file prior to check out ("Check Out" or "Check In" lock)
2) After check out is performed with the "None" lock option, the file is editable and has the change "edit" (in the "View Pending Changes" window)
That is, the previously set lock level was essentially removed (demoted) by the check out option.

So the lock level may be either promoted or demoted by a check out operation (this occurs only if the lock is owned by the user performing the check out, or the user has admin permissions for locks). Personally, I think this is something to watch for, or at least to be aware of.

As Microsoft explained in their answer, locks are affected only by explicit check outs; implicit check outs (for example, when auto check out on edit is enabled) do not affect locks.
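In object model terms, an explicit check out corresponds to Workspace.PendEdit with an explicit LockLevel, which is where the promotion or demotion happens. A sketch (the server URL, workspace name, and path are made up):

```csharp
using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation.VersionControl.Client;

class ExplicitCheckOut
{
    static void Main()
    {
        TeamFoundationServer tfs =
            TeamFoundationServerFactory.GetServer("http://tfsserver:8080");
        VersionControlServer vcs = (VersionControlServer)
            tfs.GetService(typeof(VersionControlServer));
        Workspace workspace =
            vcs.GetWorkspace("WORKSTATION", vcs.AuthenticatedUser);

        // An explicit check out with LockLevel.None: if the caller already
        // holds a check-out or check-in lock on the file, that lock is
        // demoted (removed), as described above. Passing LockLevel.CheckOut
        // or LockLevel.Checkin would promote it instead.
        workspace.PendEdit(
            new string[] { "$/Northwind/foo.cs" },
            RecursionType.None,
            null,            // no file type filter
            LockLevel.None);
    }
}
```
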

Monday, April 03, 2006

How to create and add to source control a solution with controlled projects

The idea is to have several Visual Studio solutions under source control - one solution containing all projects for release purposes, and smaller modular solutions for development of the different modules.

I proceeded to achieve the goal as follows:

1. Created a new solution and added several projects to it
2. Added the solution with all projects to source control (using the "Add Solution to Source Control" menu)
3. Closed the solution
4. Created a new empty workspace
5. Created a new blank solution
6. Added an existing source-controlled project to the solution using the "Add Project from Source Control" menu; mapped the project path to be a subfolder of the solution folder. At that stage I had a modular solution with a single project and wanted to add that solution to source control - and here things started going wrong
7. Tried to add the solution to source control using the "Add Solution to Source Control" menu. The solution is not added, and error messages are displayed to the effect that there are no pending check-ins on the solution files.

As was explained in the following discussion, to avoid these errors and reach the goal, the scenario must be somewhat revised. So instead, do the following (steps 1-3 stay the same):

4. Create a new empty workspace, don't add any mappings.
5. Create a new blank solution and check the "Add solution to Source Control" box. It will prompt you for a location in SC to store the solution. If you created the new solution in an unmapped folder, there will also be a dropdown to select the workspace.
6. Bring up the "Add Project from Source Control" dialog. After you choose the project (and the location on disk if necessary), it should appear in Solution Explorer as already bound (with a padlock as the status indication).
7. Check in the solution. Since the project is already bound, you only have to check in the two solution files.