Wooley's LINQ Wonderings

Insights and observations regarding LINQ

  • LINQ in Action now in Chinese

    Today, I received an unexpected surprise in the mail: a copy of LINQ in Action translated into Chinese. We were aware that someone was making a Chinese translation, but expected it to cover only a couple of chapters. It turns out the entire book was translated, including the bonus chapter 14 (LINQ to DataSets), which didn't make the printed English version of the book. Hopefully nothing got lost in translation for this version. If you read Chinese, check the book out and let us know what you Thinq.
  • LINQ to CSV using DynamicObject

    When we wrote LINQ in Action we included a sample of how to simply query against a CSV file using the following LINQ query:

    
    From line In File.ReadAllLines("books.csv")
    Where Not line.StartsWith("#")
    Let parts = line.Split(","c)
    Select Isbn = parts(0), Title = parts(1), Publisher = parts(3)
    

    While this code does make dealing with CSV easier, it would be nicer if we could refer to our columns as if they were properties where the property name came from the header row in the CSV file, perhaps using syntax like the following:

    
    From line In MyCsvFile
    Select line.Isbn, line.Title, line.Publisher
    

    With strongly typed (compile-time) structures, this is challenging when dealing with variable data structures like CSV files. One of the big enhancements coming with .Net 4.0 is the inclusion of dynamic language features, including the new DynamicObject type. In the past, when working with dynamic runtime structures, we were limited to reflection tricks to access members by name. The new dynamic language constructs offer better ways of dispatching the call over dynamic types. Let's see what we need to do to expose a CSV row using the new dynamic features in Visual Studio 2010.
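
    For contrast, here is a minimal sketch (mine, not from the original post) of the kind of reflection trick we previously relied on to read a member by name at runtime. Note that it only works for properties that actually exist on the compiled type, which is exactly the limitation dynamic dispatch removes:

    Function GetPropertyValue(ByVal instance As Object, ByVal name As String) As Object
        ' Look the property up by name on the runtime type.
        Dim prop = instance.GetType().GetProperty(name)
        If prop Is Nothing Then Throw New MissingMemberException(name)
        Return prop.GetValue(instance, Nothing)
    End Function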

    First, let's create an object to represent each row that we read. This class inherits from the new System.Dynamic.DynamicObject base class, which sets up the base functionality to handle the dynamic dispatching for us. All we need to add is the implementation that tells the object how to fetch values based on a supplied field name. The constructor takes a string representing the current row, which we split on the separator (a comma), along with a dictionary mapping field names to their indexes. Given these two pieces of information, we can override TryGetMember and TrySetMember to get and set values based on the field name:

    
    Imports System.Dynamic
    
    Public Class DynamicCsv
        Inherits DynamicObject
    
        ' Maps field names (taken from the header row) to column indexes.
        Private _fieldIndex As Dictionary(Of String, Integer)
        ' The current row's values, split on the separator.
        Private _RowValues() As String

        Friend Sub New(ByVal currentRow As String,
                       ByVal fieldIndex As Dictionary(Of String, Integer))
            _RowValues = currentRow.Split(","c)
            _fieldIndex = fieldIndex
        End Sub

        ' Invoked when code reads a member that isn't statically defined,
        ' e.g. line.Isbn. Returning False lets the runtime raise the
        ' standard missing-member error for unknown field names.
        Public Overrides Function TryGetMember(ByVal binder As GetMemberBinder,
                                               ByRef result As Object) As Boolean
            If _fieldIndex.ContainsKey(binder.Name) Then
                result = _RowValues(_fieldIndex(binder.Name))
                Return True
            End If
            Return False
        End Function

        ' Invoked when code assigns to a member that isn't statically defined.
        Public Overrides Function TrySetMember(ByVal binder As SetMemberBinder,
                                               ByVal value As Object) As Boolean
            If _fieldIndex.ContainsKey(binder.Name) Then
                _RowValues(_fieldIndex(binder.Name)) = value.ToString
                Return True
            End If
            Return False
        End Function
    End Class
    

    With this in place, we just need to add a class to handle iterating over the individual rows in our CSV file. As we pointed out in the book, using File.ReadAllLines can be a significant performance bottleneck for large files, so instead we implement a custom enumerator. In our custom enumerable, we initialize the process in the GetEnumerator method, which opens a stream based on the supplied filename and builds our dictionary of field names from the values in the first (header) row. Because we keep the stream open through the lifetime of this class, we implement IDisposable to clean up the stream.

    As we iterate over the results by calling MoveNext, we read each subsequent row and wrap it in a DynamicCsv instance. We return each row as Object (dynamic in C#) so that it can be consumed as a dynamic type in .Net 4.0. Here's the implementation:

    
    Imports System.Collections
    
    Public Class DynamicCsvEnumerator
        Implements IEnumerator(Of Object)
        Implements IEnumerable(Of Object)
    
        Private _FileStream As IO.TextReader
        Private _FieldNames As Dictionary(Of String, Integer)
        Private _CurrentRow As DynamicCsv
        Private _filename As String
    
        Public Sub New(ByVal fileName As String)
            _filename = fileName
        End Sub
    
        Public Function GetEnumerator() As IEnumerator(Of Object) _
            Implements IEnumerable(Of Object).GetEnumerator
    
            _FileStream = New IO.StreamReader(_filename)
            Dim headerRow = _FileStream.ReadLine
            Dim fields = headerRow.Split(","c)
            _FieldNames = New Dictionary(Of String, Integer)
            For i = 0 To fields.Length - 1
                _FieldNames.Add(GetSafeFieldName(fields(i)), i)
            Next
            ' Note: we don't pre-read the first data row here. The first call
            ' to MoveNext reads it; pre-reading would cause it to be skipped.
    
            Return Me
        End Function
    
        Function GetSafeFieldName(ByVal input As String) As String
            Return input.Replace(" ", "_")
        End Function
    
        Public Function GetEnumerator1() As IEnumerator Implements IEnumerable.GetEnumerator
            Return GetEnumerator()
        End Function
    
        Public ReadOnly Property Current As Object Implements IEnumerator(Of Object).Current
            Get
                Return _CurrentRow
            End Get
        End Property
    
        Public ReadOnly Property Current1 As Object Implements IEnumerator.Current
            Get
                Return Current
            End Get
        End Property
    
        Public Function MoveNext() As Boolean Implements IEnumerator.MoveNext
            ' Read the next data row; stop at end of file or at a blank line.
            Dim line = _FileStream.ReadLine
            If line IsNot Nothing AndAlso line.Length > 0 Then
                _CurrentRow = New DynamicCsv(line, _FieldNames)
                Return True
            Else
                Return False
            End If
        End Function
    
        Public Sub Reset() Implements IEnumerator.Reset
            _FileStream.Close()
            GetEnumerator()
        End Sub
    
    #Region "IDisposable Support"
        Private disposedValue As Boolean ' To detect redundant calls
    
        ' IDisposable
        Protected Overridable Sub Dispose(ByVal disposing As Boolean)
            If Not Me.disposedValue Then
                ' Guard against disposing before GetEnumerator opened the stream.
                If disposing AndAlso _FileStream IsNot Nothing Then
                    _FileStream.Dispose()
                End If
                _CurrentRow = Nothing
            End If
            Me.disposedValue = True
        End Sub
    
        ' This code added by Visual Basic to correctly implement the disposable pattern.
        Public Sub Dispose() Implements IDisposable.Dispose
            Dispose(True)
            GC.SuppressFinalize(Me)
        End Sub
    #End Region
    
    End Class
    
    
    Now that we have our custom enumerable, we can consume it using standard dot notation by turning Option Strict Off in Visual Basic, or by referencing it as a dynamic type in C#:

    VB:

    
    
    <TestMethod()>
    Public Sub OpenCsv()
        Dim data = New DynamicCsvEnumerator("C:\temp\Customers.csv")
        For Each item In data
            TestContext.WriteLine(item.CompanyName & ": " & item.Contact_Name)
        Next
    
    End Sub
    

    C#:

    
    [TestMethod]
    public void OpenCsvSharp()
    {
        var data = new DynamicCsvEnumerator(@"C:\temp\customers.csv");
        foreach (dynamic item in data)
        {
            TestContext.WriteLine(item.CompanyName + ": " + item.Contact_Name);
        }
    }
    

    In addition, since we are exposing this as an IEnumerable, we can use all of the same LINQ operators over our custom class:

    VB:

    
    Dim query = From c In data
                Where c.City = "London"
                Order By c.CompanyName
                Select c.Contact_Name, c.CompanyName
    
    For Each item In query
        TestContext.WriteLine(item.CompanyName & ": " & item.Contact_Name)
    Next
    

    C#:

    
    [TestMethod]
    public void LinqCsvSharp()
    {
        var data = new DynamicCsvEnumerator(@"C:\temp\customers.csv");
        var query = from dynamic c in data 
                    where c.City == "London"
                    orderby c.CompanyName
                    select new { c.Contact_Name, c.CompanyName };
    
        foreach (var item in query)
        {
            TestContext.WriteLine(item.CompanyName + ": " + item.Contact_Name);
        }
    }
    

    Note: This sample makes a couple of assumptions about the underlying data and implementation. First, we take an extra step to translate header strings that contain spaces, replacing each space with an underscore. While spaces are legal in a CSV header, it isn't legal in VB to write "MyObject.Some Property With Spaces". We manage this by requiring the code to access the property as "MyObject.Some_Property_With_Spaces".

    Second, this implementation doesn't handle fields that contain commas. Typically, fields in CSV files that contain commas are wrapped in quotes (and embedded quotes are in turn escaped by doubling them). This implementation accounts for neither situation; I purposely left those details out in order to focus on the use of DynamicObject in this sample. I welcome enhancement suggestions to make this more robust; a starting point is sketched below.
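
    If you do want to handle quoted fields, here's a minimal sketch (my own, not from the book) of a quote-aware splitter that could replace the simple Split call in DynamicCsv. It handles fields wrapped in quotes and embedded "" escapes, though not fields containing line breaks:

    Friend Function SplitCsvLine(ByVal line As String) As String()
        Dim fields As New List(Of String)
        Dim current As New System.Text.StringBuilder()
        Dim inQuotes As Boolean = False
        Dim i As Integer = 0
        While i < line.Length
            Dim ch As Char = line(i)
            If inQuotes Then
                If ch = """"c Then
                    If i + 1 < line.Length AndAlso line(i + 1) = """"c Then
                        current.Append(""""c) ' escaped quote inside a quoted field
                        i += 1
                    Else
                        inQuotes = False ' closing quote
                    End If
                Else
                    current.Append(ch)
                End If
            ElseIf ch = """"c Then
                inQuotes = True ' opening quote
            ElseIf ch = ","c Then
                fields.Add(current.ToString()) ' unquoted comma ends the field
                current.Length = 0
            Else
                current.Append(ch)
            End If
            i += 1
        End While
        fields.Add(current.ToString())
        Return fields.ToArray()
    End Function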

    Crossposted from ThinqLinq.com

  • LINQ to SQL supported data types and functions

    When we were writing LINQ in Action, we weren't able to list all of the methods and functions that have supported LINQ to SQL query translations, for a few reasons.

    1. There were too many to be included in the scope of the book.
    2. The book was being written at the same time that LINQ itself was evolving and more query comprehensions were gaining support, giving us a moving target whose accuracy we couldn't guarantee for the shipped product.
    3. We realized that over time, translations for more functions might be added, so a list enumerated in the book might not reflect the methods supported by a given framework version.

    As I was searching for an answer to a recent question, I happened upon a listing on MSDN showing the functions and methods which are and are not supported. The full list of LINQ to SQL supported and unsupported members is available online at http://msdn.microsoft.com/en-us/library/bb386970.aspx.

    As an example, the following members are shown as having translations for DateTime values: Add, Equals, CompareTo, Date, Day, Month, and Year. In contrast, members like ToShortDateString, IsLeapYear, and ToUniversalTime are not supported.

    If you need to use one of the unsupported methods, you have to force the results to the client and evaluate them there using LINQ to Objects. You can do that with the .AsEnumerable extension method at any point in the query comprehension. Any portion of the query that follows AsEnumerable is evaluated on the client side, as the sketch below illustrates.
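
    As an illustration, here's a hedged sketch (the Orders table and context names are placeholders, not from the MSDN article) showing where the server/client split happens:

    ' Everything before AsEnumerable is translated to TSQL and runs on the
    ' server; everything after it runs on the client via LINQ to Objects.
    Dim serverSide = From o In context.Orders _
                     Where o.OrderDate.HasValue _
                     Select o
    Dim clientSide = From o In serverSide.AsEnumerable() _
                     Select o.OrderID, ShortDate = o.OrderDate.Value.ToShortDateString()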

    Crossposted from http://www.thinqlinq.com/Default/LINQ-supported-data-types-and-functions.aspx

  • Enabling the Expression Tree Visualizer in Visual Studio 2008

    In LINQ in Action, we discuss how to add the LINQ to SQL Query visualizer into the Visual Studio 2008 environment. This tool allows you to open a window during debug time to view the TSQL that is generated from the LINQ expression tree. It also allows you to run the query and view the results. If you're not familiar with it, check out this post by Scott Guthrie.

    In addition to the query visualizer, you can also build and install the Expression Tree visualizer, not only as a separate application but also as an integrated visualizer within Visual Studio 2008. To do this, download the LINQ samples from the MSDN Code Gallery. Inside, you will find a project for the ExpressionTreeVisualizer. To use it as a stand-alone utility, build and run the ExpressionTreeVisualizersApplication. This is the method most people are familiar with.

    Building the solution will also build the ExpressionTreeVisualizer library, which is the one you need in order to host the visualizer natively in Visual Studio. Copy the generated ExpressionTreeVisualizer.dll and paste it into your ..\Program Files\Microsoft Visual Studio 9.0\Common7\Packages\Debugger\Visualizers directory.

    Once you have placed the library in the visualizers directory, let's see what you can do with the new visualizer. First, let's build a LINQ to SQL query:

    Dim query = From cust In dc.Customers _
                Where cust.City = "London" _
                Order By cust.CompanyName _
                Select cust

    Given this query, we need to access the expression tree exposed by the IQueryable query object as follows:

    Dim queryExpression = query.Expression

    Now that we have our code set up, set a breakpoint after the queryExpression variable has been instantiated and debug your project. If you hover over the query.Expression value, you'll see a new magnifying glass as shown below:

    [Screenshot: the debugger DataTip with the new visualizer magnifying glass icon]

    Clicking the visualizer icon launches the visualizer tool, revealing the following screen:

    [Screenshot: the Expression Tree Visualizer window displaying the query's expression tree]

    Sure, there's a lot of information in there; expression trees are quite complex. This tool helps you decipher them in cases where you need to parse or dynamically create expression trees in your applications, as in the sketch below.
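
    As a taste of the dynamic-creation side, here's a minimal sketch that builds the Where predicate from the query above by hand, using the System.Linq.Expressions factory methods (the Customer type stands in for your mapped entity):

    Imports System.Linq.Expressions

    ' Build the predicate cust => cust.City = "London" programmatically.
    Dim custParam = Expression.Parameter(GetType(Customer), "cust")
    Dim cityProperty = Expression.Property(custParam, "City")
    Dim body = Expression.Equal(cityProperty, Expression.Constant("London"))
    Dim predicate = Expression.Lambda(Of Func(Of Customer, Boolean))(body, custParam)
    ' Hovering over predicate.Body while debugging brings up the visualizer.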

  • Object Identity tracking changes with LINQ to SQL SP1

    When we wrote LINQ in Action, we took a bit of time to explain how the identity tracking system works in LINQ to SQL to make sure that changed objects are retained when subsequent queries are issued against a data context. In a nutshell, when you issue a query, the data context translates the LINQ query into TSQL and sends that to the database. The database returns the rowsets to LINQ to SQL. The provider checks the returned rows against those it is already tracking from previous fetches and, rather than instantiating the object again, returns the object from its internal store. This is done primarily to ensure that changes a user has made during the context's lifetime are retained rather than overwritten.

    We also discussed (p. 258 if you're following along) how there is a special optimization wherein if you are querying for a single result, the pipeline would check the internal cache first before looking at the database, thus reducing the overhead of repeated hits to the database. An astute reader checked out our claim, and sure enough that optimization did not make it into the RTM bits of VS 2008. We considered fixing this in the second printing, but consulted with the product teams first. It turns out that the intended behavior was indeed to include this optimization, but due to a last minute bug, it didn't make it in.

    As Dinesh points out, this oversight has been fixed in SP1. Now, if you try to fetch a single object (using Single, SingleOrDefault, First, or FirstOrDefault), the in memory object cache will be checked based on the identity columns declared in the entity's structure. If a matching object is found, it will be returned, otherwise the record will be requested from the database.
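
    Here's a quick hedged sketch of the SP1 behavior (the entity and key names are illustrative):

    ' Both lookups use the same DataContext. With SP1, the second Single
    ' call is satisfied from the context's identity cache (matched on the
    ' declared identity column) without another database round trip.
    Dim first = dc.Customers.Single(Function(c) c.CustomerID = "ALFKI")
    first.ContactName = "Changed locally"
    Dim second = dc.Customers.Single(Function(c) c.CustomerID = "ALFKI")
    ' second Is first evaluates to True and the local change is preserved.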

     

  • Screen scraping and creating Word documents with LINQ to XML

    At TechEd Developers 2008 in Orlando, I had the pleasure of competing in Speaker Idol. In that competition, we had the opportunity to present a topic in 5 minutes. Unfortunately, the topic I choose really needed 10 minutes to cover at the level of detail it needed. Instead of limiting the topic, I decided to go ahead and present it a bit too fast.

    If you want to see the video, or see how to use VB 9's XML Literals and LINQ to XML to fetch data from a web page (which must be XHtml compliant), manipulate it, and insert it into a Word 2007 file, it is now available on the Developer Landing page and the Library page of the TechEd site. If you prefer, you can jump right to the video in either WMV or MP4 format. If you're not familiar with LINQ to XML, go ahead and download the video and just watch it at half speed ;-)

  • LINQ to SQL's support for POCO

    One of the strengths that LINQ to SQL has over the upcoming Entity Framework is its support for POCO, or Plain Old CLR Objects. With LINQ to SQL, the framework doesn't require any particular base classes, interfaces, or even reliance on the 3.5 framework for the resulting objects. I demonstrated this in the talk I did at the TechEd Tweener weekend. Download the demo project to see it in action.

    In this sample, I created two separate projects. The first is a class library project targeting only the 2.0 framework; as a result, it can't use any LINQ-specific techniques. This also allows us to consume the resulting objects in projects that don't have access to the newer framework or to all of the namespaces, which is particularly important in cases like Silverlight. To call attention to the differences between the projects, I declared the 2.0 project in C# and the LINQ-enabled project in VB.

    The 2.0 class library project consists of a single class file. This represents the Subject entity from the Linq In Action database.

    using System;

    namespace UnmappedClasses
    {
        public class Subject
        {
            public Guid ID { get; set; }
            public string Name { get; set; }
            public string Description { get; set; }
        }
    }

    Notice that there are no interfaces, base classes, or custom attributes. Excluding the attributes is critical here, because the standard <Table> and <Column> attributes reside in the System.Data.Linq.Mapping namespace, which is not available in the 2.0 framework.

    The class itself consists of just three auto-implemented properties. Auto-implemented properties are used for brevity here and remain consumable from the .Net 2.0 Framework, because they rely on compiler features rather than runtime features.

    Because we can't allow the class structure to include the attributes, we can't use the LINQ to SQL designer classes or SQL Metal to generate our classes. We do need to have a way to indicate the mapping to our data store. Here is where the XML Mapping file comes in handy.

    When instantiating the DataContext, we can rely either on inline attributes or on an external mapping file. Luckily, the XML mapping file's structure is concise and very similar to the attributes that would otherwise have been applied to the class. The main difference is that we need to indicate the Type used for a given table, since we are not directly annotating the class itself. The other difference you may notice is that I don't include the Storage attribute. While there is nothing to stop me from using it in a mapping source, we can't identify the backing field when using auto-implemented properties.

    <?xml version="1.0" encoding="utf-8"?>
    <Database Name="lia" xmlns="http://schemas.microsoft.com/linqtosql/mapping/2007">
      <Table Name="dbo.Subject" Member="Subject">
        <Type Name="UnmappedClasses.Subject">
          <Column Name="ID" Member="ID"  DbType="UniqueIdentifier NOT NULL" IsPrimaryKey="true" />
          <Column Name="Name" Member="Name" DbType="VarChar(100) NOT NULL" CanBeNull="false" />
          <Column Name="Description" Member="Description" DbType="VarChar(200)" />
        </Type>
      </Table>
    </Database> 

    Now, with that out of the way, we can get to the LINQ portion of the work. Actually, that is quite easy. In our 3.5 enabled project, we will create a XmlMappingSource, pass it into the constructor of the DataContext and then fetch the object from this context as we would any other LINQ enabled class.

    ' Requires Imports System.Data.Linq, System.Data.Linq.Mapping, and System.Xml.Linq
    Dim map = XmlMappingSource.FromXml(XDocument.Load("C:\Projects\LINQ\AdvancedLinqToSql\WinformDemo\lia.map").ToString)
    Using dc As New DataContext(My.Settings.liaConnectionString, map)
        Me.SubjectBindingSource.DataSource = dc.GetTable(Of UnmappedClasses.Subject)()
    End Using

    This example happens to bind the results to a Winform object binding source, but you could expose them to ASP.NET directly, through an encapsulation layer such as a repository pattern, or via a service interface; a repository sketch follows.
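
    For example, here's a hypothetical minimal repository over the mapped table (a sketch of the encapsulation-layer option, not code from the demo project), reusing the same connection string and mapping source shown above:

    Imports System.Data.Linq
    Imports System.Data.Linq.Mapping

    Public Class SubjectRepository
        Private ReadOnly _connection As String
        Private ReadOnly _map As MappingSource

        Public Sub New(ByVal connection As String, ByVal map As MappingSource)
            _connection = connection
            _map = map
        End Sub

        ' ToList materializes the results so the DataContext can be
        ' disposed before they are handed back to the caller.
        Public Function GetAll() As List(Of UnmappedClasses.Subject)
            Using dc As New DataContext(_connection, _map)
                Return dc.GetTable(Of UnmappedClasses.Subject)().ToList()
            End Using
        End Function
    End Class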

    Crossposted from www.ThinqLinq.com

  • LINQ in Action at TechEd Developer 2008 in Orlando

    If you're at TechEd Developer in Orlando, make sure to find me. If you miss me on the convention floor, I'll also be participating in Speaker Idol on Wednesday 6/4 at noon. In addition, I will be doing a book signing in the store at 1:00 on Thursday, 6/5. As you can see from this picture, they have a couple of copies of the book that you can buy if you didn't bring yours. I look forward to meeting you.
  • Danny Simmons compares the Entity Framework to similar technologies

    It seems that everyone else is chiming in on Danny Simmons' recent comparisons of the Entity Framework with other similar technologies. There are several items I wanted to address from his observations.

    Regarding the EF vs. LINQ to SQL, he makes two basic points: 1) That there isn't a provider model for other data sources and 2) That LINQ to SQL requires a 1-1 table to object mapping. On the second item, there is no denying the limitation. While you can work around the issue with LINQ to SQL's support for views, stored procs, and functions, this is the key differentiator between LINQ to SQL and the Entity Framework. Danny's statements regarding inheritance limitations and entity splitting both stem back to the TPH (table per hierarchy) limitation of LINQ to SQL.

    Regarding the issue with the provider model, the original intent was to have a full provider model which vendors would be able to consume. When the ADO vNext initiatives came out, it was fairly clear that the provider model would be supported in the Entity Framework. As a result, the provider model for LINQ to SQL was essentially shelved. That was the genesis of Matt Warren's series on building an IQueryable provider which basically outlines how one would go about building a provider for LINQ. Since Matt has basically handed the LINQ to SQL code base over to Danny's team, we'll see where this progresses.

    In Danny's discussions of other technologies, he seemingly makes two contradictory statements. In the beginning he states,

    "... the truth is that different problems merit different solutions."

    He then goes on to the statement:

    "This makes it easier to create a conceptual model which is how you want to think about your data and then reuse that conceptual model for a number of other services besides just building objects.  Long-term we are working to build EDM awareness into a variety of other Microsoft products so that if you have an Entity Data Model, you should be able to automatically create REST-oriented web services over that model (ADO.Net Data Services aka Astoria), write reports against that model (Reporting Services), synchronize data between a server and an offline client store where the data is moved atomically as entities even if those entities draw from multiple database tables on the server, create workflows from entity-aware building blocks, etc. etc."

    By this, he is indicating that the EF should be the de facto mechanism for building the entity structures around which the various components of your system will be built: one tool to answer the various problems. Whether the EF really is the best way to expose relational data from a reporting or analysis perspective, time and performance analyses will tell. He does go on in the comments to his post to indicate that he is not necessarily advocating reusing the same entities for all of these application components, but rather using similar tooling to create the potentially separate entities that each subsystem will consume. While that makes the programmatic portion easier, the jury is still out on the other areas.

    I'm still getting up to speed on the EF, but I do have concerns regarding the POCO (Plain Old CLR Objects) story and some of the other requirements that the EF puts on your entity and database models.

    Crossposted from http://www.thinqlinq.com/Default/Danny-Simmons-compares-the-Entity-Framework-to-similar-technologies.aspx

  • Projecting an unmapped property into an anonymous type with LINQ to SQL

    On page 216 of LINQ in Action, I made a comment that unmapped properties in a mapped class cannot be used in a LINQ to SQL projection. This was true with the beta bits, but only partially true with the release bits. To begin, let's consider the Author table we have in the book samples.

    The Author class has separate fields for the first and last name, each of which is mapped to the corresponding field in the Author table. In the book, we show how you can create a read-only property in a partial class (so that it won't get clobbered when we regenerate our classes in the designer). The new property is trivial:

    Partial Public Class Author
        Public ReadOnly Property FormattedName() As String
            Get
                Return Me.FirstName & " " & Me.LastName
            End Get
        End Property
    End Class

    Notice here that there are no mapping attributes to this property. In part, that is because there is no corresponding field in the table. As we show in the book, you are free to query the author table and return Author objects. From there, you can display the FormattedName as follows:

    Dim authors = From a In context.Authors _
                  Select a
    For Each a In authors
        Console.WriteLine(a.FormattedName & "; " & a.WebSite)
    Next

    This works fine because we are projecting the complete Author type. However, in early builds, we couldn't project the unmapped properties into an anonymous type like this:

    Dim authors = From a In context.Authors _
                  Select a.FormattedName, a.WebSite

    If you tried to use this projection, you would get a runtime exception. In the RTM bits, the behavior was modified: if you run the above query (sample 8.25 in the book samples, for anyone following along), you will see that the query succeeds and the anonymous type is populated. So how can the provider know how to populate FormattedName when it is not mapped and doesn't exist in the table itself? No, it doesn't look inside the property, determine which mapped properties are used, and fetch them. While that could work in our simple example, many unmapped properties would require significantly more resources, reference members that aren't part of our class, or call methods with no direct translation to TSQL. If you look at the SQL that is issued when the query is consumed, you can figure out what is happening in this case.

    SELECT [t0].[ID], [t0].[LastName], [t0].[FirstName], [t0].[WebSite], [t0].[TimeStamp]
    FROM [dbo].[Author] AS [t0]

    Notice that the select clause sent to the database is not optimized to return only the fields we requested; instead, all of the fields are returned. So what's going on? In evaluating the Select clause, the provider discovered that there were unmapped properties. At that point, it simply populated a full Author object and then generated the anonymous type from that object, rather than directly from the underlying data store. It's a bit of smoke and mirrors.

    So the question that came up asks whether the next printing of the book needs to remove the statement that you can't project an unmapped property. While you can indeed project these properties, you can't use them elsewhere in the query. Thus, if you want to sort the data based on the unmapped property, the exception is still thrown. Consider the following query:

    Dim authors = From a In context.Authors _
                  Order By a.FormattedName _
                  Select a.FormattedName, a.WebSite

    In this case when we try to run it, we get the following error:

    "System.NotSupportedException: The member 'LinqInAction.LinqBooks.Common.VB.SampleClasses.Ch8.Author.FormattedName' has no supported translation to SQL."

    Because of this, I plan to leave the note in the chapter warning against using the unmapped property in your query. Unfortunately, I don't have enough space in the book to insert this complete explanation at this time. I hope it helps some of you who were confused at this point; a workaround for the ordering case is sketched below.
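
    If you do need to order by the unmapped property, a hedged workaround is to force the sort to the client with AsEnumerable, so that the Order By runs in LINQ to Objects, where FormattedName is an ordinary property. The tradeoff is that every author is fetched before sorting:

    Dim authors = From a In context.Authors.AsEnumerable() _
                  Order By a.FormattedName _
                  Select a.FormattedName, a.WebSite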

    Crossposted from ThinqLinq.com.

  • Geek Speak resources for the LINQ Migration Strategies talk

    I had a blast on the Geek Speak today. If you missed it, they will have it available on demand from their blog at http://blogs.msdn.com/geekspeak/. You can even subscribe to the audio using the Zune podcasting functionality via their RSS feed. One of the things I love about these events is the variety of questions that attendees bring.

    As I mentioned, the slide deck I used for the session is available in the Files section here as Linq Migration Strategies.

    Below are some of the questions that came up and some links to back up my answer for your reading edification.

    Q: How do you import an XML schema at the top of a VB file, and if the schema is in the project, will it give you IntelliSense?

    Beth Massi and Avner Aharoni demonstrate this in the webcast available at http://blogs.msdn.com/bethmassi/archive/2008/01/18/channel-9-interview-xml-properties-and-enabling-intellisense.aspx. They also demonstrate generating the schema from a sample XML document.

    Q: How would you replace an existing data layer with LINQ to SQL?

    I started showing how to take the Personal Web Starter Kit and begin to LINQ enable it. The completed project is available at http://code.msdn.microsoft.com/LinqPersonalWeb.

    Q: How does LINQ perform as compared to the other alternatives?

    Any time you add a level of indirection there will be some performance penalty. With LINQ to Objects, the implementation basically leverages the iterator pattern, the same way C# 2.0's yield return works. For LINQ to SQL, the best series on performance is from Rico Mariani, starting with http://blogs.msdn.com/ricom/archive/2007/06/22/dlinq-linq-to-sql-performance-part-1.aspx

    Q: Where can we find patterns and practices guidance on LINQ?

    This post is more along the lines of the Framework Design Guidelines book, but it is good information anyway: http://blogs.msdn.com/mirceat/archive/2008/03/13/linq-framework-design-guidelines.aspx

    There is also a Live from Redmond VB9 webcast discussing best practices at http://msevents.microsoft.com/CUI/WebCastEventDetails.aspx?EventID=1032337466

    Additional "How do I" videos are available at  http://msdn2.microsoft.com/en-us/vbasic/bb466226.aspx

    Q: What advice do you have for passing data across tiers? If we can’t pass context across tiers, can we pass resultant objects?

    There isn't really a single definitive post on this one. Searching for "DataContext" and "short lived" or "unit of work" should turn up a number of examples, including the following:

    http://msdn2.microsoft.com/en-us/library/system.data.linq.datacontext.aspx

    This forum post includes responses by Keith Farmer and Matt Warren, who were both intimately involved in creating LINQ to SQL: http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=2485909&SiteID=1. Matt states, "You'll most often want a new context for every interesting unit of work. DataContexts should be short lived if they can be. The only scenario where you'd keep a DataContext around for a long time would be if you were building the equivalent of a 2-tier UI based data-entry/editing application."
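
    A minimal sketch of that short-lived unit-of-work pattern (NorthwindDataContext and the entity names here are illustrative, not from the forum post):

    Public Sub RenameContact(ByVal customerId As String, ByVal newName As String)
        ' One context per unit of work: open, query, change, submit, dispose.
        Using dc As New NorthwindDataContext()
            Dim cust = dc.Customers.Single(Function(c) c.CustomerID = customerId)
            cust.ContactName = newName
            dc.SubmitChanges()
        End Using
    End Sub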

    Dino Esposito notes: http://weblogs.asp.net/despos/archive/2008/03/19/more-on-datacontext-in-hopefully-a-realistic-world.aspx

    Some additional resources mentioned:

    Rick Strahl: http://www.west-wind.com/weblog/default.aspx

    Beth Massi: http://blogs.msdn.com/bethmassi/

    MSDN LINQ forum: http://forums.microsoft.com/MSDN/ShowForum.aspx?ForumID=123&SiteID=1

    Q: Regarding LINQ to Entities, I hear that the entities will not use the same change tracking model, in order to permit passing them across tiers by making the EntitySets serializable and having changes tracked on a set-by-set basis. Can you confirm or deny? Any info on this?

    I don't know enough to comment on this. I would recommend asking the question on the ADO.NET Entity Framework forum at http://forums.microsoft.com/MSDN/ShowForum.aspx?ForumID=533&SiteID=1.

    Crossposted from ThinqLinq.com

  • GeekSpeak discusses LINQ Migration Strategies

    Tomorrow, Wednesday 4/2/2008, I will be the guest speaker on the Geek Speak webcast. We will be discussing strategies for beginning to incorporate LINQ into your existing application infrastructure. In many cases, that does not mean replacing your entire data stack, but rather using pieces of LINQ to add functionality in new components. Please join us. The Geek Speak webcasts are often driven by attendee questions, and the more questions, the better the event.

    When: Wednesday, April 02, 2008, 12:00 - 1:00 PM (GMT-08:00) Pacific Time (US & Canada) or 3:00 - 4:00 PM Eastern Time.
    Where: Live Meeting
    Registration URL:
     
    Crossposted from www.ThinqLinq.com.
  • LINQ enabled Personal Web Starter Kit

    I love it when projects take on a life of their own. A while back, I posted my LINQ-enabled Personal Web Starter Kit in VB and received several requests for a C# port. Thankfully, one brave soul stepped up and did the port for me. Thanks go to Stephen Murray for undertaking the challenge. As is often the case, one of the best ways to learn a technology is to use it.

    If you're interested in this sample, you can check out the project at the MSDN code center. Specifically, you can access the original VB version or Stephen's C# Port. You can read more about this project on the Thinq Linq web site. As always, let us know what you Thinq.

  • LINQ in Action going to press

    At long last, the process of writing my first book is coming to a close. I started this project last March, and through the process we had to revisit our work numerous times, including each time a new CTP or beta drop arrived. For me it was 10 months, and for Fabrice 2 years; this week we found out that the book is going to press.

    What does this mean for you? If you purchased the eBook, the final version is available now. Additionally, the samples are available online in both C# and VB. We are also making three chapters available for free if you are considering the book but aren't sure yet.

    If you purchase the hard copy from Manning, we understand that it should arrive around the first of February. That should mean it will be shipping from the online outlets, like Amazon, by the middle of February.

    I hope you find the book as rewarding to read as it was for us to write.

    crossposted from www.thinqlinq.com

    Crossposted from http://devauthority.com/blogs/jwooley/default.aspx
  • Announcing ThinqLinq.com

    You may have seen me present it at a speaking engagement. You may have watched the podcasts. You may have even downloaded the sample application. Now you can see it in action.

    ThinqLinq.com is now live.

    The site was designed completely in VB with LINQ as the data access mechanism. The base application was built in 2 hours, going from not knowing RSS to importing an RSS feed, displaying it on a form, and producing a new feed from the imported data. The site is a testament to the power of LINQ and the RAD capabilities it brings. Head on over to the site and check it out.

    Crossposted from http://devauthority.com/blogs/jwooley/default.aspx