
Roslyn VSIX installer crashes with System.UriFormatException? Try installing as Administrator

Installing Roslyn on a machine where I had local administrator rights worked without issues. At work though, the installer crashed and the following event was logged:

 Fault bucket , type 0
 Event Name: CLR20r3
 Response: Not available
 Cab Id: 0

 Problem signature:
 P1: vsixinstaller.exe
 P2: 12.0.30324.0
 P3: 532f5607
 P4: System
 P5: 4.0.30319.18408
 P6: 52311185
 P7: 1cf7
 P8: 75
 P9: System.UriFormatException
 P10:

To overcome this, open a Developer Command Prompt with administrator privileges, go to the folder where the Roslyn VSIX is stored and execute:

    vsixinstaller "Roslyn End User Preview.vsix"


Asynchronous Programming in .NET Made Simple: From Delegates to the Task Parallel Library

A few people have been asking lately for samples of asynchronous programming with sockets or pipes, so I decided to put together samples of the various asynchronous programming models in .NET. The scenario is the same in all the examples:

  1. A server pipe is created and starts waiting for connections
  2. A client pipe is created and connects to the server pipe
  3. The server starts listening for data
  4. The client sends a UTF-8 encoded number to the server asynchronously and waits for a response
  5. The server responds with “You sent x” to the client asynchronously
  6. The client reads the response and writes it to the console

The asynchronous version of a function (e.g. BeginRead, BeginWrite, BeginWaitForConnection) is used whenever one is available. All samples implement a simple interface to make running the samples easier:

public interface IAsyncBase
{
        void StartServer();
        void StartClient();
}

Running a sample then looks like this:

IAsyncBase sample = new TaskSockets();
sample.StartServer();
sample.StartClient();

.NET 1.0 – Delegates

The first option is Delegates, available since good old .NET 1.0. Delegates made asynchronous programming easier than the raw Windows API by simplifying callbacks and hiding some of the explicit thread handling. Unfortunately, the resulting code is a bit verbose:

   public class AsyncPipes : IAsyncBase
   {
       protected NamedPipeServerStream serverPipe = new NamedPipeServerStream("MyName", PipeDirection.InOut, 100, PipeTransmissionMode.Message, PipeOptions.Asynchronous);
       protected NamedPipeClientStream clientPipe = new NamedPipeClientStream(".", "MyName", PipeDirection.InOut, PipeOptions.Asynchronous);
       byte[] clientBuffer = new byte[20];
       byte[] serverBuffer = new byte[20];

       public void StartServer()
       {
           serverPipe.BeginWaitForConnection(AfterServerConnect,null);
       }

       public void StartClient()
       {
           clientPipe.Connect();
           var output = Encoding.UTF8.GetBytes("5");
           clientPipe.BeginWrite(output, 0, output.Length, AfterClientWrite, null);
       }

       public void AfterClientWrite(IAsyncResult a)
       {
           clientPipe.EndWrite(a);
           clientPipe.BeginRead(clientBuffer, 0, clientBuffer.Length, AfterClientRead, null);
       }

       public void AfterServerConnect(IAsyncResult a)
       {
           serverPipe.EndWaitForConnection(a);
           serverPipe.BeginRead(serverBuffer, 0, serverBuffer.Length, AfterServerRead, null);
       }

       public void AfterServerRead(IAsyncResult b)
       {
           int count = serverPipe.EndRead(b);
           var input = Encoding.UTF8.GetString(serverBuffer, 0, count);
           string message = String.Format("You sent {0}", input);
           byte[] messageBytes = Encoding.UTF8.GetBytes(message);
           serverPipe.BeginWrite(messageBytes, 0, messageBytes.Length, AfterServerWrite, null);
       }

       public void AfterClientRead(IAsyncResult b)
       {
           int count = clientPipe.EndRead(b);
           string message = Encoding.UTF8.GetString(clientBuffer, 0, count);
           Console.WriteLine(message);
       }

       public void AfterServerWrite(IAsyncResult c)
       {
           serverPipe.EndWrite(c);
        }
    }

 

The code is long and hard to follow. A separate function is needed for each callback, so the simple logic of the two endpoints (connect, read, respond) gets scattered all over the class.

C# 3.0 – Lambdas

Lambdas, introduced with C# 3.0, made things quite a bit easier by allowing us to embed the callbacks:

 
public class AsyncPipesLambda : IAsyncBase
{
    protected NamedPipeServerStream serverPipe = new NamedPipeServerStream("MyName", PipeDirection.InOut, 100, PipeTransmissionMode.Message, PipeOptions.Asynchronous);
    protected NamedPipeClientStream clientPipe = new NamedPipeClientStream(".", "MyName", PipeDirection.InOut, PipeOptions.Asynchronous);
    byte[] clientBuffer = new byte[20];
    byte[] serverBuffer = new byte[20];

    public void StartServer()
    {
        serverPipe.BeginWaitForConnection(a =>
        {
            serverPipe.EndWaitForConnection(a);
            serverPipe.BeginRead(serverBuffer, 0, serverBuffer.Length, b =>
                    {
                        int count = serverPipe.EndRead(b);
                        var input = Encoding.UTF8.GetString(serverBuffer, 0, count);
                        string message = String.Format("You sent {0}", input);
                        byte[] messageBytes = Encoding.UTF8.GetBytes(message);
                        serverPipe.BeginWrite(messageBytes, 0, messageBytes.Length,
                            c =>serverPipe.EndWrite(c), null);
                    }, null);
        }, null);
    }

    public void StartClient()
    {
        clientPipe.Connect();
        var output = Encoding.UTF8.GetBytes("5");
        clientPipe.BeginWrite(output, 0, output.Length, a =>
        {
            clientPipe.EndWrite(a);
            clientPipe.BeginRead(clientBuffer, 0, clientBuffer.Length, b =>
            {
                int count = clientPipe.EndRead(b);
                string message = Encoding.UTF8.GetString(clientBuffer, 0, count);
                Console.WriteLine(message);
            }, null);
        }
        , null);
    }
}

The code is now more concise and easier to read and write, and the logic of the client and server is clear. We still have to nest one callback inside the other, though, which makes it hard to compose multiple asynchronous steps in one method.

.NET 4.0 – Task Parallel Library

That changes with .NET 4.0 and the Task Parallel Library, which offers primitives for asynchronous programming as well as for the parallel execution of tasks. Each asynchronous step can now be mapped to a task using the FromAsync method, and continuation from one step to another is expressed using ContinueWith. In fact, it is now possible to create extension methods for each asynchronous stream operation that make the asynchronous code look a lot more like normal code. The WriteAsync and ReadAsync methods in the following class come from the ParallelExtensionsExtras library.

 

public class TaskPipes:IAsyncBase
{
    protected NamedPipeServerStream serverPipe = new NamedPipeServerStream("MyName", PipeDirection.InOut, 100, PipeTransmissionMode.Message, PipeOptions.Asynchronous);
    protected NamedPipeClientStream clientPipe = new NamedPipeClientStream(".", "MyName", PipeDirection.InOut, PipeOptions.Asynchronous);
    byte[] clientInput = new byte[20];

    public void StartServer()
    {            
        Task.Factory.FromAsync(serverPipe.BeginWaitForConnection, serverPipe.EndWaitForConnection, serverPipe).
            ContinueWith(t =>
            {
                byte[] serverInput = new byte[20];
                serverPipe.ReadAsync(serverInput, 0, serverInput.Length)
                    .ContinueWith(rt =>
                    {
                        var input = Encoding.UTF8.GetString(serverInput, 0, rt.Result);
                        string message = String.Format("You sent {0}", input);
                        byte[] messageBytes = Encoding.UTF8.GetBytes(message);
                        serverPipe.WriteAsync(messageBytes, 0, messageBytes.Length);
                    });
            }
        );

    }


    public void StartClient()
    {
        Task.Factory.StartNew(() =>
        {
            clientPipe.Connect();
            var output = Encoding.UTF8.GetBytes("5");
            clientPipe.WriteAsync(output, 0, output.Length)
            .ContinueWith(_ => clientPipe.ReadAsync(clientInput,0,clientInput.Length)
                .ContinueWith(t=>
                {
                    string message = Encoding.UTF8.GetString(clientInput, 0, t.Result);
                    Console.WriteLine(message);
                }));
        });
    }

}

The TPL and ParallelExtensionsExtras offer more than just eliminating the Begin/EndXXX methods. It is now possible to create iterators over a sequence of tasks and even use LINQ on them. ParallelExtensionsExtras includes a set of extension methods for streams (ReadAllBytesAsync, CopyStreamToStreamAsync etc.) that make working with streams a lot easier by using iterators over tasks.
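
To make the iterator idea concrete, here is a minimal sketch of such a driver, loosely modeled on the Iterate helper that ParallelExtensionsExtras exposes (Task.Factory.Iterate, if memory serves). The snippet below is illustrative only, not the library's actual code, and it skips the fault-propagation niceties of the real thing:

using System;
using System.Collections.Generic;
using System.Threading.Tasks;

public static class TaskIterationExtensions
{
    // Drives an iterator of tasks: each yielded task gets a continuation
    // that advances the iterator to the next step when it completes.
    public static Task Iterate(this TaskFactory factory, IEnumerable<Task> steps)
    {
        var tcs = new TaskCompletionSource<object>();
        var enumerator = steps.GetEnumerator();

        Action<Task> step = null;
        step = previous =>
        {
            try
            {
                if (enumerator.MoveNext())
                    enumerator.Current.ContinueWith(step);   // schedule the next step
                else
                    tcs.SetResult(null);                     // iterator exhausted, sequence done
            }
            catch (Exception ex)
            {
                tcs.SetException(ex);
            }
        };

        step(null);
        return tcs.Task;
    }
}

With a driver like this, the server logic can be written as a single C# iterator method that yields one task per asynchronous step, so the flow reads top to bottom instead of nesting ContinueWith calls.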

Using sockets instead of pipes is just as easy. Instead of a server and a client pipe, we use TcpListener and TcpClient objects:

public class TaskSockets:IAsyncBase
{
    public const int PORT = 10901;
    
    protected TcpListener server = new TcpListener(IPAddress.Any ,PORT);
    protected TcpClient client = new TcpClient();
    byte[] clientInput = new byte[20];

    public void StartServer()
    {            
        server.Start();
        Task.Factory.FromAsync<TcpClient>(server.BeginAcceptTcpClient, server.EndAcceptTcpClient, server).
            ContinueWith(t =>
            {
                byte[] serverInput = new byte[20];
                var stream=t.Result.GetStream();
                stream.ReadAsync(serverInput, 0, serverInput.Length) 
                    .ContinueWith(rt =>
                    {
                        var input = Encoding.UTF8.GetString(serverInput, 0, rt.Result);
                        string message = String.Format("You sent {0}", input);
                        byte[] messageBytes = Encoding.UTF8.GetBytes(message);
                        stream.WriteAsync(messageBytes, 0, messageBytes.Length);
                    });
            }
        );

    }


    public void StartClient()
    {
        Task.Factory.FromAsync(client.BeginConnect,client.EndConnect,"localhost",PORT,null)
            .ContinueWith(c=>
        {                
            var output = Encoding.UTF8.GetBytes("5");
            var stream = client.GetStream();
            stream.WriteAsync(output, 0, output.Length)
            .ContinueWith(_ => stream.ReadAsync(clientInput,0,clientInput.Length)
            .ContinueWith(t=>
                {
                    string message = Encoding.UTF8.GetString(clientInput, 0, t.Result);
                    Console.WriteLine(message);
                }));
        });
    }

}

Still in the labs – Reactive Extensions


The latest evolution in asynchronous programming comes with the Reactive Extensions for .NET, which allow us to write asynchronous programs almost as if we were writing old-fashioned synchronous programs:

public class ReactivePipes:IAsyncBase
{
    protected NamedPipeServerStream serverPipe = new NamedPipeServerStream("MyName", PipeDirection.InOut, 100, PipeTransmissionMode.Message, PipeOptions.Asynchronous);
    protected NamedPipeClientStream clientPipe = new NamedPipeClientStream(".", "MyName", PipeDirection.InOut, PipeOptions.Asynchronous);

    public void StartServer()
    {           
        
        var connect = Observable.FromAsyncPattern(serverPipe.BeginWaitForConnection, serverPipe.EndWaitForConnection);
        var read=Observable.FromAsyncPattern<byte[],int,int,int>(serverPipe.BeginRead,serverPipe.EndRead);
        var write=Observable.FromAsyncPattern<byte[],int,int>(serverPipe.BeginWrite,serverPipe.EndWrite);
        
        connect().Subscribe(u =>
            {
                byte[] msg = new byte[20];
                read(msg, 0, msg.Length).Subscribe(i =>
                {
                    var input =Encoding.UTF8.GetString(msg, 0,i);
                    string message =String.Format("You sent {0}",input);
                    byte[] buffer =Encoding.UTF8.GetBytes(message);
                    write(buffer, 0, buffer.Length);
                });
            });
    }


    public void StartClient()
    {
        var read = Observable.FromAsyncPattern<byte[], int, int, int>(clientPipe.BeginRead, clientPipe.EndRead);
        var write = Observable.FromAsyncPattern<byte[], int, int>(clientPipe.BeginWrite, clientPipe.EndWrite);

        clientPipe.Connect();
        var output = Encoding.UTF8.GetBytes("5");
        write(output, 0, output.Length).Subscribe(u =>
        {
            byte[] bytes = new byte[20];
            read(bytes, 0, bytes.Length).Subscribe(i =>
            {
                string message = Encoding.UTF8.GetString(bytes, 0, i);
                Console.WriteLine(message);
            });
        });
    }
}

 

This time, instead of defining a delegate or a task, we define an Observable (a source of events) and subscribe to it, passing the code we want executed when the event we want to observe occurs. In this case, we subscribe to observables wrapping the Begin/End pairs, so our code runs when each EndWrite, EndRead etc. completes.
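
Because the results are observables, the nested subscriptions can also be flattened with standard LINQ operators. A hedged sketch of the client side using SelectMany; 'write', 'read' and 'output' are the delegates and buffer created in the client code above:

// Sketch only: compose write-then-read as a single observable pipeline
byte[] bytes = new byte[20];
write(output, 0, output.Length)
    .SelectMany(_ => read(bytes, 0, bytes.Length))
    .Subscribe(count =>
        Console.WriteLine(Encoding.UTF8.GetString(bytes, 0, count)));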

From a scattered mess of callbacks to code that almost reads like a normal function. Not bad, not bad at all.

P.S. Visual Studio has excellent support for parallel debugging, developed by a Greek guy and his team. So be sure to visit Daniel Moth's blog and thank him for an excellent job.


How not to get a month's name

I had a genuine TheDailyWTF moment earlier today, when I found this C# equivalent of VB6 code that returns a three-letter month name:

DateTime time = DateTime.Now;
string foldername = time.Year.ToString() + "_" + GetMonthName(time.Month);


private static string GetMonthName(int month)
{
    switch (month)
    {
        case 1:
            return "Jan";

        case 2:
            return "Feb";

        case 3:
            return "Mar";

        case 4:
            return "Apr";

        case 5:
            return "May";

        case 6:
             return "Jun";

        case 7:
             return "Jul";

        case 8:
             return "Aug";

        case 9:
             return "Sep";

        case 10:
             return "Oct";

        case 11:
             return "Nov";

        case 12:
             return "Dec";
    }
    return month.ToString();
}

This code runs on a server with non-English regional settings. My guess is the coder wanted to ensure that (s)he would get back the English month name even if the thread ran with non-English regional settings.

The .NET Framework has already solved this problem by accepting a provider parameter in the String.Format and DateTime.ToString() methods. The provider parameter accepts any CultureInfo object, including CultureInfo.InvariantCulture, which guarantees that string conversions are locale-agnostic (in practice, it matches the en-US settings). Getting the folder name is as simple as this:

DateTime time = DateTime.Now;
string name = time.ToString("yyyy_MMM", CultureInfo.InvariantCulture);
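
And if all you need is the month name itself, DateTimeFormatInfo can hand it to you directly. A quick sketch using the standard GetAbbreviatedMonthName API, which takes a 1-12 month number:

using System;
using System.Globalization;

// Returns the three-letter name for the invariant culture: "Jan", "Feb", ...
string monthName = CultureInfo.InvariantCulture
    .DateTimeFormat.GetAbbreviatedMonthName(DateTime.Now.Month);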

Totally wrong way to filter a combo box

From the same legacy code as this post comes this snippet, which loads a combo box with ListItems and then removes unwanted items in a way that is guaranteed to throw once the index passes the middle of a list that now holds only half its initial items:

ddlItems.Items.Clear();

// Re-load so we can filter the required
UIHelper.FillListBoxFromSharePointList(ddlItems, Microsoft.SharePoint.SPContext.Current.Web, "Items");

#region Remove all items from list that are not allowed for selection by the current user
int listCount = ddlItems.Items.Count;

for (int i = 0; i < listCount; i++)
{
    try
    {
        ListItem item = ddlItems.Items[i];

        string roleId = item.Value;

        if (roleId.Length > 0)
        {
            roleId = string.Concat(";", roleId, ";");

            if (!allowedRolesStringList.Contains(roleId))
            {
                ddlItems.Items.Remove(item);
                i = i - 1;  // Skip back 1 index - since we deleted one.
            }
        }
    }
    catch { ; }
}

#endregion

Obviously the coder noticed the problem and, instead of fixing it, swept it under the catch {;} carpet. The immediate problem could of course be fixed just by reversing the direction of the iteration:

ddlItems.Items.Clear();

// Re-load so we can filter the required
UIHelper.FillListBoxFromSharePointList(ddlItems, Microsoft.SharePoint.SPContext.Current.Web, "Items");

#region Remove all items from list that are not allowed for selection by the current user
int listCount = ddlItems.Items.Count;

for (int i = listCount-1; i >= 0; i--)
{
    ListItem item = ddlItems.Items[i];
    string roleId = item.Value;

    if (roleId.Length > 0)
    {
        roleId = string.Concat(";", roleId, ";");
        if (!allowedRolesStringList.Contains(roleId))
        {
            ddlItems.Items.Remove(item);
        }
    }
}

#endregion

Of course the real solution would be to filter the collection of list items before loading them into the combo box. Or, better yet, bind the combo to the list of desired items instead of adding the list items one by one. Unfortunately, the code that loads the ListItems and adds them to the combo is lost, recoverable only through Reflector.
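
Just to sketch what filter-before-binding could look like: assuming a hypothetical GetListItems helper standing in for the lost loading code, and the allowedRolesStringList from the snippets above, the whole loop collapses to a LINQ query (System.Linq):

// GetListItems is hypothetical; it stands in for the loading code inside
// FillListBoxFromSharePointList and returns the candidate ListItems.
var allowed = GetListItems(Microsoft.SharePoint.SPContext.Current.Web, "Items")
    .Where(item => item.Value.Length == 0 ||
                   allowedRolesStringList.Contains(";" + item.Value + ";"));

ddlItems.Items.Clear();
ddlItems.Items.AddRange(allowed.ToArray());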

P.S. Notice the #region blocks inside the code. You know your methods are too big when you have to break them apart using #region. It is infinitely better to break the regions out into their own methods instead of creating this (un)maintainability nightmare.


Totally wrong way to use SPSite, SPWeb

Almost all SharePoint developers know that you should dispose of the SPSite and SPWeb objects you create in code. Doing so is not hard; you just enclose your objects in a using statement. I found the following piece of code in dozens of places in some legacy SharePoint code created by people who should have known better:

SPSite site = null;
SPWeb web = null;

try
{

    SPSecurity.RunWithElevatedPrivileges(delegate()
    {

        site = new SPSite(Microsoft.SharePoint.SPContext.Current.Web.Url);
        web = site.OpenWeb();

        //Blah Blah Blah
    });
}
catch (Exception MyMethodException)
{
    WriteErrorMessage(string.Format("MyMethodException : {0}", MyMethodException.Message));
}
finally
{
    web.Dispose();
    site.Dispose();
}

Looks like whoever wrote this code never heard of the using statement, available in .NET since v1. It's not just that this code is more verbose than it needs to be. It also broadens the scope of the site and web variables and exposes them to abuse. This can be a serious problem if the code contains large methods, and this code has several 100+ line methods.
Worse, it's easy to get this pattern wrong by forgetting the finally clause. Indeed, there were dozens more places where the coder did exactly that and wrote something like this:

SPSite site = null;
SPWeb web = null;

try
{
    SPSecurity.RunWithElevatedPrivileges(delegate()
    {
        site = new SPSite(Microsoft.SharePoint.SPContext.Current.Web.Url);
        web = site.OpenWeb();
        //Blah Blah Blah

        site.Dispose();
        web.Dispose();
    });


}
catch (Exception MyMethodException)
{
    WriteErrorMessage(string.Format("MyMethodException : {0}", MyMethodException.Message));
}

Ouch! Guaranteed memory leak in case of exception! And this is how the code would look if using was used:

try
{
    SPSecurity.RunWithElevatedPrivileges(delegate()
    {
        using (SPSite site = new SPSite(Microsoft.SharePoint.SPContext.Current.Web.Url))
        using (SPWeb web = site.OpenWeb())
        {
            //Blah Blah Blah
        }
    });
}
catch (Exception MyMethodException)
{
    WriteErrorMessage(string.Format("MyMethodException : {0}", MyMethodException.Message));
}

Launch PARTY for Visual Studio 2010

This time the Visual Studio launch will not be a conference with talks but ... a party on Friday night, with pizza and beer! Sign up while you still can!


Overheard: TechEd Europe 2010 to take place in Berlin

No info on dates though. Strange that, three months after TechEd Europe 2009, there is still no official announcement for TechEd Europe 2010.

How to load the Greek stemmer and word breaker for SQL Server Full Text Search

Ever wondered how you can use full text search for Greek text in SQL Server? Out of the box, SQL Server doesn't provide any stemmers or word breakers for Greek, which makes FTS behave much like a simple LIKE search. Fortunately, the same binary interfaces are used across all Microsoft products, which means that you can use the stemmers and word breakers from other products to enable Greek FTS in SQL Server - as long as you have the license for them!

As a technical exercise, you can use the Greek stemmer and word breaker from SharePoint Server, which are described in KB929912. All you have to do is add the appropriate registry entries to SQL Server. The generic process for adding new word breakers and stemmers to SQL Server is described in How To: Load Licensed Third Party Word Breakers. The steps are as follows:

  1. Copy the grclr.dll, grste.dll and grcste.lex files from the C:\PROGRAM FILES\MICROSOFT OFFICE SERVERS\12.0\Bin\ folder to the C:\Program Files\Microsoft SQL Server\MSSQL10.MSSQLSERVER\MSSQL\Binn folder. SQL Server looks in this folder to locate stemmers and word breakers.
  2. Add the proper registry entries for Locale, WBreakerClass and StemmerClass. For convenience you can create a registry file with the proper values:
    Windows Registry Editor Version 5.00
    
    [HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SQL Server\MSSQL10.MSSQLSERVER\MSSearch\CLSID]
    
    [HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SQL Server\MSSQL10.MSSQLSERVER\MSSearch\CLSID\{1FB980F8-1764-4920-B8E5-89E341205B4A}]
    "Default"="grclr.dll"
    
    [HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SQL Server\MSSQL10.MSSQLSERVER\MSSearch\CLSID\{3E9A499D-1A5C-4ca8-B948-C5D18DC466B1}]
    "Default"="grcste.dll"
    
    [HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SQL Server\MSSQL10.MSSQLSERVER\MSSearch\Language]
    
    [HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SQL Server\MSSQL10.MSSQLSERVER\MSSearch\Language\grc]
    "TsaurusFile"="tsgrc.xml"
    "Locale"=dword:00000408
    "WBreakerClass"="{1FB980F8-1764-4920-B8E5-89E341205B4A}"
    "StemmerClass"="{3E9A499D-1A5C-4ca8-B948-C5D18DC466B1}"
  3. Execute exec sp_fulltext_service 'update_languages'; in SQL Server Management Studio to load Greek FTS support.
  4. Execute 
    select * from sys.fulltext_languages where lcid=1032
    to verify that Greek is actually detected.
    Note that you may have to restart the SQL Server Filter Daemon service before SQL Server actually starts using the Greek stemmer.
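
Note that the queries below also assume the target table already has a full-text index created with the Greek LCID. A minimal sketch, where the dbo.testwords table and its primary key index name PK_testwords are illustrative:

CREATE FULLTEXT CATALOG GreekCatalog;

-- LANGUAGE 1032 tells SQL Server to use the Greek word breaker and stemmer
CREATE FULLTEXT INDEX ON dbo.testwords([words] LANGUAGE 1032)
    KEY INDEX PK_testwords ON GreekCatalog;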

After the steps above, you can start using the CONTAINS and FREETEXT T-SQL predicates to search text and files in Greek. Running

select * from dbo.testwords where freetext([words],N'Τρέχω')

will return

ID, Words
--  ---------
10, Έτρεξα
11, Έτρεχα
15, Τρέχει
16, Τρέχεις
17, Τρέχω

WARNING: This article describes just a technical exercise! I do not know if the Greek stemmer can actually be used outside a SharePoint installation. You should contact Microsoft Hellas to clarify whether and how you can use the stemmer.

And to think that when the Greek community asked in MS Connect for Greek FTS support, we were told that there were "no immediate plans to support Greek FTS" - when it had already been available for over a year!


Visual Studio 2010 and .NET Framework 4 Launch Date

In a very short blog post, Rob Caron just announced that Visual Studio 2010 will launch on Monday, April 12, 2010.

Just waiting now for the SharePoint launch date!


I just LOVE MaxiVista!

I found MaxiVista through Scott Hanselman's post on multi-monitor productivity. I installed it a few days ago and I LOVE it already. It's not just that I can use my old laptop as a second monitor for my new one. It's not only working with Visual Studio 2010 spread across two screens. Or having TweetDeck on the second screen (oops, laptop) with VS 2008 on the primary screen (ooooops Laptop, laptop, laptop!). Multi-monitor debugging, side-by-side comparison of requirement documents and big Excel sheets. Running VS on the primary, Team Explorer on the secondary screen.

It is that MaxiVista allows you to switch between the second laptop's desktop and the primary's extended desktop just by pressing a key and still use the primary's keyboard and mouse. The clipboard also works across both laptops. If this isn't the easiest way to remotely control another PC, I don't know what is!

If only I could do the same with Remote Desktop connections, I would be really happy!


If Product X was a motorcycle ...

... it would probably be a 1948 Indian Warrior. For those not into motorcycle history, the Indian Motorcycle Company used to be the largest motorcycle manufacturer before WWII, with a fanatical following. The company failed to keep up with the times. Outdated tooling, poor development and quality, and underpowered engines alienated even the greatest fans, while its greatest competitor, Harley Davidson, won the hearts of riders. The Indian Motorcycle Company folded in 1953.


SharePoint v. Next predictions

Information on upcoming changes to SharePoint, whether full versions or service packs, is a matter of life and death for those of us developers and administrators who don't want to face mildly disconcerting surprises. Yet finding information on upcoming versions of SharePoint has historically been challenging, to say the least.

With SharePoint 2010 coming sometime late this year or early next year, a lot of people have started making informed guesses on changes and new features. Some of the guesses are based on announcements and conversations in conferences, some are based on slips by Microsoft employees, some on the semi-official warnings of SharePoint SP2's upgrade checker. Here are some of my guesses, including their sources.

  • Business Data Catalog will use a provider model to communicate with other systems. The current version requires a complicated XML configuration file to communicate with other systems. The complexity of this file and the lack of tooling mean that it is often easier to create an intermediate web service that translates other systems' web service APIs into an API that BDC can easily work with.
    Source: Slip by Microsoft presenter. During a presentation on the current version of BDC he said that BDC does have a provider model. The only reason to make such a mistake is that he mixed up current and next version features.
  • Surveys will allow you to create truly anonymous surveys. The current version keeps the username for each answer and displays * for anonymous surveys. Once you change the survey from anonymous to normal you can see all names.
    Source: Slip by Microsoft presenter. This is a bit unreliable though, as he probably didn't know how easy it is to find the names in an anonymous survey today.
  • Unmanaged core code will be replaced by managed code. The current version at its core uses COM+ and unmanaged code that requires explicitly releasing (or disposing in .NET speak) objects. This is a bit tricky and easy to forget, especially when you are coding against SharePoint's .NET API. This causes numerous resource leaks, not only in client code, but also in SharePoint's own .NET code.
    Source: Observations of other Office Server products. The latest versions of Project Server and Commerce Server moved from a mixture of .NET over COM+ core to full .NET code. It's time the product they depend on made this move as well.
  • Solution-based deployment will change - but I have no idea how. The current version of SharePoint is essentially un-cloudable. Scaling out is too difficult and costly. It requires not only installing SharePoint on a new server but also all custom code that was already installed on other machines. Provided, that is, that you are using domain accounts everywhere, including IIS anonymous accounts. Otherwise moving from a single WFE farm to multiple WFEs can be quite a bit trickier.
    Source: Observation on Microsoft's current push to the cloud. The current model requires hours if not days to provision a new server. The cloud demands minutes. I just hope the changes MS makes to accommodate SharePoint Online find their way to the SKUs as well.
  • LINQ to SharePoint, finally. But will it allow you to do everything CAML does, e.g. search the entire site hierarchy for items of a specific content type?
    Source: Conversations with Microsoft employees at Teched.
  • Maybe, the solution format will change as well. I'm just expressing a hope here. The current format is a barely documented, inconsistent nightmare. Tools like WSPBuilder make life a lot easier and Microsoft's own Visual Studio Extensions for WSS are finally catching up to WSPBuilder. Still, the solution format is a poor substitute for a proper Setup project.

Finally, a plea to Microsoft. Microsoft, please, PLEASE, don’t wait until the last moment to release SharePoint’s beta! The next version will have many changes and the last thing anyone wants is for major bugs to appear in the final product because there was no time to fix them.

Releasing betas early is absolutely essential as the development iterations for SharePoint related products seem to be rather long. It took almost two years to fix attachments in custom list forms. QA seems to have issues as well, as the trial activation bug in SharePoint Service Pack 2 demonstrates.

One thing is certain. A lot of people are willing to test and submit bugs for SharePoint. The SharePoint Connect site is flooded with bug reports, even though it was only activated last November to collect feedback on VSeWSS 1.3. There was no way for those bug reports to reach Microsoft before November. The only real alternative was to waste a support incident simply to file a bug report that would probably not get fixed for another year. Needless to say, many customers and partners chose not to report the bugs at all.

So PLEASE Microsoft, LET US HELP YOU!


A pictorial comparison of Enterprise Service Bus products

... no cars this time! :)

Enterprise Application Integration products

Your services WILL be assimilated

Commercial Enterprise Service Bus products

They connect with everything. For a cost.

Open Source Service Bus

Small, Fast, To the point. And FUN!

EAI products have given ESBs and SOA such a bad name that some have proclaimed SOA dead. They are good at what they do, but you have to adapt to their reality. ESBs like NServiceBus or Rhino Service Bus, on the other hand, don't try to integrate and translate everything. They are small, fast and fun, and they adapt to your architecture instead of the other way around.


If ORMs were cars ...

Then LINQ to SQL would be the Smart ForTwo. A seriously fun and easy way to get around town, as long as you don't want to carry more than two passengers and maybe a suitcase.

NHibernate would be the HMMWV. It works great out of the box if you are willing to give up some creature comforts. There is just no way you would drive this around town, but it can go pretty much everywhere. What's more, you can adapt it to a thousand different configurations with the proper kit, and it has a passionate following of HMMWV aficionados that can help you get over any problems you may have.
Be careful though. You had better know what you are doing before you take one of these out for a ride ...

Finally, Entity Framework v1 is the Ford F450. In the body-and-chassis version.

Creature comforts, ease of driving, but you have to add some parts yourself if you want to use it for work. The good thing is that you can certainly do that. It's just that, unlike the Ford F450, there aren't any pre-built packages available for EF right now. Luckily, EF v2 promises not only to provide pre-built packages but it will also come in several pre-built models. That will certainly ease some of the adoption pain.

 

UPDATE: Ayende thinks that NHibernate would actually be this  ...

The Hummer H2 Limo.


new CloudApp() - The International Azure™ Services Platform Developer Challenge

Speaking of Azure ...

The international Azure developer challenge new CloudApp() opened on June 1. The submission deadline is July 9th. The winner will be chosen by community voting between July 10 and 20 and will be announced on July 21.

Gentlemen, Start your Editors!


If Azure Table Storage and SQL Data Services were cars ...

SQL Data Services would be this.

The Porsche. An engineering marvel, yet beautiful and comfortable. In the hands of an average driver, it goes fast. In the hands of an experienced driver, it goes VERY fast. Its electronic systems will forgive many of the driver's mistakes yet provide the maximum available power when needed. This car will turn heads wherever it goes.

And then there is this.

Friendly, it is not. Unless you know how to drive it, you will stall a dozen times before you even get on the track. It's not easy to control either. Comfort is a matter of discussion, once you account for the fact that you have to take the steering wheel off just to get in the car. Its beauty is debatable, as everything is sacrificed in order to achieve the best aerodynamic shape.

Yet, in the hands of an experienced race driver,  it is fast. INSANELY fast! This car exists for one thing and one thing only, and that is SPEED.

And that's what the Azure Table Storage Service is all about.


Project Server 2007 Pre-Populated Timesheet spookiness

Today I encountered a rather intriguing new behaviour in Project Server 2007 timesheets. When a user creates a timesheet, Project Server can check for any tasks that fall within the timesheet's dates and add lines for each of the tasks it finds. Project Server will even check the % complete field of each task and fill in the appropriate work hours for you in the timesheet, which can be a real time-saver.
This is called prepopulation and is controlled by the "Default Timesheet Creation Mode" setting in Timesheet Settings and Defaults.

If you don't like the prepopulated cell values, you can go change them or empty the cells. Before SP2 you could also delete any timesheet lines you didn't like. Once you saved your timesheet, the unwanted lines were gone. When you are assigned to a lot of projects you really want to delete any unwanted lines.

With SP2 though, this behaviour changed in an intriguing way. You can still delete the lines from the UI, but once you save the timesheet the lines are re-created using the prepopulation values. Yet the extra hours do not appear in the timesheet's total time! Should you save the timesheet with the resurrected values, the totals are updated to reflect the prepopulated hours as well. What gives?

Do the lines exist in the database? After some spelunking and SQL profiling I discovered that the prepopulated-then-deleted lines do not appear in the MSP_TIMESHEET_LINES table of the Published database (that's where the timesheets live). That explains why the 'ghost' lines do not count in any sums. After you save the timesheet though, the lines re-appear in the table.

It would seem that a small bug in the Timesheet ActiveX control displays prepopulated values based on the user's assignments when it can't find any timesheet lines in the database. That's useful the first time the timesheet is used, but it can be really annoying when you delete a line and it reappears like a ghost out of the grave. When you save the timesheet, the ActiveX control sends everything down to the server and the 'ghost' lines get resurrected.
Of course, I can't verify this without reverse-engineering the ActiveX, but the hypothesis seems to fit the facts.

Spooky!

A simple workaround is to empty each cell or fill it with 0s instead of actually deleting the lines. Unfortunately, some of my users are not so spiritually inclined, what with over 100 tasks assigned to each timesheet ....

UPDATE: Duuuh, the timesheet page doesn't use an ActiveX control, it is a .NET Page using a Timesheet webpart! It's the Project Center page that uses an ActiveX control! Which is good news, because now I can check the timesheet page with Reflector ...

UPDATE 2: Duudaaah, the search ends at the obfuscated Microsoft.Office.Project.PWA.WebParts.TimesheetPart.LoadTimesheetData method ...

UPDATE 3: Looks like this is not a bug, but a "feature" (here, here and in the KB) ??! So that you don't forget to notify your project manager that your task needs to be removed? Then why not disable the Delete button for this timesheet line? Or better yet, why not let the organisation decide whether the PM has to approve the timesheet line deletion by adding a "Delete prepopulated timesheet lines" permission ?

 


SharePoint SP2 Installation Peculiarities

By now everyone has heard that SharePoint Server Service Pack 2 activates a 180-day trial timeout. Fortunately, this really minor problem (no-one is affected for the next 6 months) can be fixed easily following the steps described in KB 971620. Or you can wait for the hotfix, which will probably come before the trial expires.

Other problems though, are not so minor. I ran into the following problem when installing SP2 on a machine. The installation of the binaries finished without any problems. Then I ran the Configuration Wizard, which also ran without any problems - until the final step. That's when it crashed due to a "PostSetupConfigurationTaskException". The wizard dialog box pointed me to a PSCDiagnostics_XXXX.log that contained the rather cryptic line:

Final step in the wizard failed with:
- An exception of type Microsoft.SharePoint.PostSetupConfiguration.PostSetupConfigurationTaskException was thrown.  Additional exception information: Failed to upgrade SharePoint Products and Technologies

Obviously. Then I looked at the Upgrade.log in the Logs directory of the 12 hive.

[AssemblyReferenceFixUp] [3.0.4.0] [ERROR] [5/4/2009 12:07:14 PM]: Application Web Config for this IIS site (772645167) could not be found at C:\inetpub\wwwroot\wss\VirtualDirectories\36043\web.config.
[SPIisWebSiteWssSequence] [ERROR] [5/4/2009 12:07:14 PM]: Action 3.0.4.0 of Microsoft.SharePoint.Upgrade.SPIisWebSiteWssSequence failed.
[SPIisWebSiteWssSequence] [ERROR] [5/4/2009 12:07:14 PM]: Object reference not set to an instance of an object.
[SPIisWebSiteWssSequence] [ERROR] [5/4/2009 12:07:14 PM]:    at Microsoft.SharePoint.Upgrade.AssemblyReferenceFixUp.Upgrade()
   at Microsoft.SharePoint.Upgrade.SPActionSequence.Upgrade()

That's more like it. The configuration wizard found a site collection without a corresponding web.config. Checking the virtual directory and the site collection, I found that it was an orphaned site collection. The site collection still appeared in the list of site collections, even though most of the files in the virtual directory were missing. The content database appeared empty in the Content Databases page, even though it contained data for multiple sites. Why did this happen? I am not certain. Perhaps a site collection delete job failed to finish. Perhaps there was some other problem.

The real problem though was that the code in SPIisWebSiteWssSequence failed to handle the error condition. The code in AssemblyReferenceFixUp correctly detected the missing file and probably returned a null value instead of a reference to the config file. The code in SPIisWebSiteWssSequence didn't check for null, so it tried to use the web.config file and crashed.

And then the error information was lost.

Obviously, the two modules, AssemblyReferenceFixUp and SPIisWebSiteWssSequence, use different logging mechanisms. One of them uses the PSCDiagnostics log while the other uses the Upgrade log. The wizard only checks the PSCDiagnostics log (or the mechanism that writes to it), so it doesn't know about the error logged in Upgrade.log.
As a result, the error message was lost and I had to dive into several logs to find it.

There are many ways to avoid such bugs. One is of course to check for nulls. Another is to use the same logging mechanism in all modules of an application. That's where logging libraries like Microsoft's own Enterprise Library or the open source log4net can really help.
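
As a minimal sketch of what a shared logging facade buys you, here is how such an error could be logged with log4net (UpgradeSequence is a hypothetical module name): every module logs through the same facade, so a single log would have caught both errors above.

using System;
using log4net;

public class UpgradeSequence
{
    // Every module gets its own logger, but all loggers write to the same
    // configured appenders, so all errors end up in one place.
    private static readonly ILog Log = LogManager.GetLogger(typeof(UpgradeSequence));

    public void Upgrade()
    {
        try
        {
            // ... upgrade work ...
        }
        catch (Exception ex)
        {
            Log.Error("Upgrade step failed", ex);  // keeps message and stack trace together
            throw;
        }
    }
}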

Have fun!


I wish people stopped marketing Cloud Computing as the poor man's application hosting!

It's like getting Herman Tilke to design and build a brand new Grand Prix circuit and then market it as a nice place to test drive sports cars.

Δημοσιεύτηκε στις από το μέλος Παναγιώτης Καναβός | 0 σχόλια

Project Server Curios: The QueueDeleteArchivedProject method

I encountered a rather interesting bug in the Project Server API a few days ago as I tried to create a utility to clean up the Archive database of a few thousand old project versions. The proper way to do this is to call the Archive.QueueDeleteArchivedProject method. Unfortunately, the method has a few problems:
  1. An unknown parameter, archiveID. There is no such parameter anywhere in the Archive web service API.
  2. Passing a Project GUID to the projectUID didn't delete any archived versions.
  3. Passing a Project GUID to the archiveID DID delete the versions.
After some spelunking with Reflector, I found out that:
  1. The archiveID parameter actually refers to the PROJ_VERSION_UID column of the ArchivedProjects.Project DataTable.
  2. The parameters were actually in the wrong order. Somewhere deep inside the web service's implementation, in the Microsoft.Office.Project.Server.BusinessLayer.Admin.QueueDeleteArchiveProjectInternal method, a ProjectDeleteParamsMessage object was constructed with the parameters in the wrong order. The projectUID parameter was passed to the projectVersionUID parameter and vice versa.
My spelunking adventure left me wondering a few things:
  • If this method doesn't work as documented, how does the Project Server UI delete old project versions?
    One possible answer is that the UI doesn't use this method. In many cases, Project Server uses the almost-totally-undocumented Admin service. It seems that this is just such a case.
  • How come no-one noticed this before? Well, at least one other developer asked about this issue a year ago but received no valid answer.
  • What do I do now? I can call the function with reversed parameters, but what happens when MS fixes the bugs?
    How will they fix it?
    Will they reverse the parameters? That would break any code that works with the current version of the method.
    Will they rename the parameters? That would be confusing, as other methods, like QueueRestoreProject, have the same parameters in the current order. Additionally, preserving the current order wouldn't fix the bug in the internals of the Archive web service.
  • Where do I report the bug? There is no Project Server Connect site, at least none that is listed in the Connect Site. Searching for Project Server reveals that there is indeed a Project Server site but when I try to access it I get a "Page not Found" error.

UPDATE:
I just found out that the Archive.ReadArchivedProjectsList method is also affected: it returns a project's UID in the PROJ_VERSION_UID column and the version UID in the PROJ_UID column. It would seem that the confusion between Project and Project Version UIDs goes all the way to the Archive database. The MSP_Projects table uses a PROJ_UID column as key that holds the Project Version UID instead of a project's actual UID.

This means that any code that wants to work with a specific project's versions has to search in the PROJ_VERSION_UID column for the project's UID.
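
For example, here is a hedged sketch of listing the archived versions of one project, working around the swap. It assumes ReadArchivedProjectsList can be called without arguments and returns a dataset containing the Project table mentioned above; treat the names as illustrative:

using System;
using System.Data;

static void ListArchivedVersions(Archive archive, Guid projectUid)
{
    DataSet archived = archive.ReadArchivedProjectsList();
    foreach (DataRow row in archived.Tables["Project"].Rows)
    {
        // Because of the bug, the project's real UID is in PROJ_VERSION_UID
        // and the version UID is in PROJ_UID.
        if ((Guid)row["PROJ_VERSION_UID"] == projectUid)
            Console.WriteLine("Found version {0}", (Guid)row["PROJ_UID"]);
    }
}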


SportTime.gr uses Microsoft Logos?

Go to http://www.sportime.gr/ and scroll to the lower right of the page. Look at the blue-and-white MVP logo. Now go to http://mvp.support.microsoft.com/ and check the logo on the top of the page. Coincidence?

If so, this could be a very unfortunate coincidence. Like all companies, Microsoft is very particular about the usage of its logos and protects them assiduously. Someone should warn SportTime but, unfortunately, there is no contact information on the site.

Kudos to papadi for finding this out.


Data Services, Facebook and Beatiful Architecture

I had one of those rare AHA! moments while I was reading Dave Fetterman's chapter on Facebook's architecture from the "Beautiful Architecture" book. Facebook presents a simplified view of its data to applications (Users, Friends, User Info) and allows them to query it using its own SQL-like language, FQL. An application's output is rendered in FBML, which is HTML extended with Facebook-specific tags like "tabs" or "put-my-friends-books-here". This allows Facebook both to integrate the application with its own UI and, more importantly, to include data, like my friends' books, that an application is not normally allowed to query due to privacy concerns.

While reading the chapter I realized that Facebook's approach to exposing its data was similar to Microsoft's ADO.NET Data Services.  ADO.NET Data Services is an attempt to expose an application's data to third-party applications. By using EF, ADO.NET Data Services provides third party applications with a simplified model of the data, just as Facebook does with Users, Friends and User Info. Using REST allows the data to be integrated in third party applications with minimal fuss. All you need to do to expose your application's data to others is create a simplified entity model for it and expose it through REST. You can also define operations on the object model that other applications can call, all inside the simplified entity model.
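
As a concrete (and entirely hypothetical) illustration, a service exposing a Books entity set lets third parties address and filter the data with nothing but URIs, using the Data Services URI conventions; BookService.svc and the Books set are made-up names:

GET /BookService.svc/Books                        -- every book in the model
GET /BookService.svc/Books(5)                     -- a single book, addressed by key
GET /BookService.svc/Books?$filter=Price lt 20    -- filtering on the server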

When seen as an alternative to web services, Data Services looks like a technology with limited applications. Accessing my data through REST, bypassing all my UI or web service logic, just seems weird. Do I really want to expose my database to the world? Nobody but me (or my colleagues) can understand the model, so what's the point? And allowing code to call object methods through REST? Is that a recipe for disaster or what? Those were just a few of the criticisms directed at ADO.NET Data Services.

BUT, when seen from Facebook's perspective, Data Services starts to make sense. Data Services is not about exposing your application's inner workings. It's not about caching (although that's an incidental benefit). It's about exposing your data through a limited API to other applications!

So THAT is what Microsoft is trying to achieve with ADO.NET Data Services!


"Beautiful Architecture" is now available in print

You can now buy the print edition of "Beautiful Architecture" from O'Reilly's site. Amazon still has the book on pre-order.

I've already finished Chapters 1, 2, 6 and 8 but have had no time to blog about them. Up to this point the most interesting is #6, The Architecture of the Facebook Platform by Dave Fetterman. Fetterman describes how Facebook managed to expose its data to external applications and integrate those applications inside Facebook while preserving privacy and security. Notice the emphasis on data, not process or workflows. But more on this in my next post.


Aspiring Software Architect Program for Microsoft Office Sharepoint Server

I just found out about the Aspiring Software Architects Program for MOSS through the MSDN Architecture site. ASAP is a series of webcasts covering subjects like architecting Internet-facing websites, architecting for performance, scalability and availability, etc. Attendees can also participate in a MOSS architecture contest and submit a solution architecture for a Knowledge Management business case.

A word of warning though. The times mentioned are in India Standard Time (GMT+5:30), which means that most Europeans will have trouble attending the live webcasts, unless you are willing to get up at 5 am! While one can still download the slides, it would be great if a discussion forum were available for people in Europe. You won't find any references to ASAP on the MS SharePoint sites or team blogs either.


Beautiful Architecture Book Digital Edition coming out today!

The digital edition of Diomidis Spinellis’ “Beautiful Architecture” book is expected to go live today. The book includes essays on some of the most interesting large scale system architectures, including Facebook and Tandem. I expect everyone knows Facebook by now, but how many people realize that everything they know about transaction processing, including things like two-phase commit and transaction logs, started with Tandem?

If the quality of Diomidis’ previous books is any indication, this book promises to be extremely interesting (to say the least). I just can’t wait to get my copy!

I wonder how many hours before O’Reilly activates the purchase link ...
