Key takeaways from NRF’s BIG Show 2014

Ian Rogers

National Retail Federation’s 103rd annual BIG Show broke records with 30,000 attendees, 500 exhibitors, and 120 educational sessions. As usual, all the hot topics on the minds of today’s retailers were on the agenda, along with a diverse range of technology exhibits.

Leading retailers from across the country were present, with different business goals but a seemingly shared vision: improve the customer shopping experience across multiple channels—online and off. Here’s a rundown of a few key themes and notable technologies that came up time and again at this year’s show. Read more of this post

Changing the on-shoring vs. offshoring equation

Mike Cowden

Earlier in my career, I had an opportunity to work in Manila at a large consultancy where I supported clients in their global development efforts. In addition to experiencing the amazing culture of the Philippines, I was front and center as offshoring took hold. Companies around the world saw the promise of lower costs, easy access to skilled resources, and the opportunity to chase the sun with the utopia of a 24-hour operation. These promises, coupled with the pressure to drive growth through cost cutting, created momentum to move to offshore services and solutions with little thought for the long-term impacts on the ability to innovate to support the business. Read more of this post

How cloud is transforming retail

Ian Rogers

The retail sector is experiencing a period of massive change on a scale not seen in at least a decade. Traditional lines of competition are blurred and the barriers to entry significantly reduced. Competition is no longer constrained by traditional rivals and the main differentiators of price, quality, brand, and location. The rise of new consumer technology has enabled competition from manufacturers, small online startups, and even long distance online retailers.

Read more of this post

One-cloud solution takes a media company to new heights

Art Fort

For the past year, I’ve been working with a media and entertainment client on a SaaS application. Hosted on Amazon Web Services (AWS), the application provides creative professionals with a “one-cloud” solution for media and content collection, collaboration, production, and transcoding of high-value, high-definition content.

I’ve been fortunate to be part of this project from the beginning—starting with some small proof of concepts, to doing a cloud migration assessment of their on-premise technologies, and ultimately contributing to application architecture, development, and deployment onto AWS. For me, two things set this project apart: the team and the technology. Read more of this post

Maximizing store space with localized inventory assortments

Ian Rogers

It’s an age-old challenge for retail: what is the ideal breadth and depth of product offerings in a store, given competing objectives such as:

  • Maximizing store turnover vs. maximizing product selection and availability
  • Store-to-store consistency vs. local market customization
  • Store vs. digital product selection and availability

This challenge has become even more difficult in today’s customer-centric omni-channel environment. Customer expectations and shopping behaviors have changed, which is challenging the role the physical store plays. Stores used to be the primary way of selling products, so the focus was on how to drive foot traffic to the store. Sales were then driven by the product availability and customer service in the store. This role is changing as the relationship between customers and retailers becomes more complex and personalized. Stores are increasingly becoming the medium for brand building, engagement, and product showcasing to drive sales across all channels. Read more of this post

Eleven Game-Changing Advances in Microsoft BI


Ten oughta do it, don’t you think? You think we need one more? You think we need one more don’t you? Ok, we’ll get one more.

I came home from the Microsoft SharePoint Conference 2012 fired up about all of the changes in the way Microsoft approaches BI, which to me are the most sweeping changes since the release of SQL 2005.

Like SQL 2005, I think that these up-and-coming technologies will change the way Microsoft delivers BI, in ways that may not be obvious yet but will emerge over the next three to five years. For example, when I recommended in 2007 that my company use Excel to deliver ad hoc reporting instead of a standard BI ad hoc solution like WebI, my CEO thought I was nuts. Even I didn’t understand at that time how integral Office would be to BI delivery, so much so that I’m questioning the future of SSRS (more on that in a minute).

I thought this would be a great time to identify emerging trends along with some bold (and not so bold) predictions about how these latest advancements will change Microsoft BI delivery. I asked my colleague Patrick Brady to join me in writing a list of some of these predictions, since he had recently been to PASS 2012 as a returning participant and a speaker. We compared notes to create a top-ten list of game-changing features and possible upcoming trends that we could write about. After paring down the list, we found we couldn’t do fewer than eleven without feeling like we were leaving something important out.

So with that, here are eleven game-changing advancements in Microsoft BI and some thoughts on what the future may hold.

Big Data

Big Data is a Big Mystery to a lot of CIOs, other than those in industries like e-commerce and social networking that have been at the forefront of understanding and advancing these concepts. However, the use of Big Data applications and related opportunities will expand further in 2013 as more organizations begin to understand the scope of this technology and realize the value of capturing and analyzing large volumes of ever-changing unstructured and semi-structured data. Microsoft recognized the emerging market opportunities surrounding Big Data early on and, as part of a strategic partnership announced in late 2011 with Hortonworks, has been busy creating HDInsight Server and Windows Azure HDInsight Service.

HDInsight is certified by Microsoft to run Hadoop on Windows Server and will suit organizations that want a dedicated on-premises big data implementation. Azure HDInsight Service provides Big Data as a Service (BDaaS) in the cloud to those organizations with occasional to frequent needs. Both products promise a simpler Big Data entry point for those organizations that have been sitting on the sidelines up to this point.

Social Analytics

Social Analytics can be defined as the process of analyzing customer sentiment by mining data available from social networks such as Facebook, Twitter, Tumblr, LinkedIn, and Google+, or private social networks such as Yammer. Its value can be demonstrated by showing how social analytical techniques can support marketing activities, assist with customer support, and identify opportunities for future product development.

With Microsoft’s acquisition of Yammer, coupled with the SQL Azure Lab named “Social Analytics,” the question must be asked: what is Microsoft planning in this space? The question becomes even more pointed in light of Yammer’s recent partnership announcement with Kanjoya, a vendor specializing in sentiment analysis. All signs point to upcoming announcements in 2013 from Microsoft regarding specific product offerings related to Social Analytics.

PowerPivot Gallery

PowerPivot is not a new technology; it was first introduced as part of SQL 2008 R2 (see below for more on in-memory analytics). Even the PowerPivot Gallery is not new, as it was introduced as part of SharePoint 2010. However, the gallery benefits from multiple enhancements in SharePoint 2013 that greatly improve its ease of use.

PowerPivot is now built into Excel Services; you no longer need to install separate instances of PowerPivot for SharePoint. By designating an SSAS tabular instance in the Excel Services data model settings, you can enable your PowerPivot Gallery. You can now also drag and drop PowerPivot files into the gallery, much as you can for other documents in SharePoint. And a Business Intelligence Semantic Model (BISM) is no longer the only possible source for a Power View report: you can build a Power View report off of existing PowerPivot documents in the gallery.

With proper training, the information workers in your organization can use the PowerPivot Gallery to create powerful in-memory visualizations without requiring report building from IT.

GeoFlow in Excel

Until recently, Microsoft has mostly focused on utilities that support geospatial reporting, such as the company’s geography/geometry data types and the extensibility of Bing Maps. Other companies such as IDV Solutions—the creators of Visual Fusion—have utilized Microsoft technologies to make geospatial reporting possible. But at SharePoint Conference 2012, Microsoft proudly introduced GeoFlow, an Excel add-in that takes advantage of Bing Maps and the same xVelocity technology as PowerPivot to produce 3D, interactive, data-driven maps within Excel.

While GeoFlow is a bit limited at this point, you can’t beat the price (included free with Excel 2013 or higher) or its ease of use. It doesn’t have all of the power that Visual Fusion has with its XML scripts and Silverlight SDK capabilities, but the developers who introduced the project are very enthusiastic and we expect the capabilities will expand rapidly.

Our colleague Marek Koenig wrote a great descriptive piece on GeoFlow, which you can access here.

Excel Services Improvements

Excel Services has been around ever since Microsoft Office SharePoint Server 2007 (MOSS), but it has been an underutilized part of the Office/BI revolution. The selling point sounds great: users can create their own reports and share them online, and viewers don’t need to have Excel installed. That was usually met with a look that said “who doesn’t have Excel?” And the amount of interaction you gave up usually made Outlook the sharing technology of choice for connected Excel documents.

Excel 2013 introduces Excel Interactive View and gives the user the ability to view multiple worksheets, interact with data, and build charts and graphs in an HTML client. Once the information workers understand these major increases in features and flexibility, you can anticipate Excel Services to take off the way many thought it would in 2007.

As interactive reporting features continue to be added to an already very easy-to-use platform, I could see SSRS being featured less prominently in Microsoft’s BI stack in the future.

Visio Services

Most people think of Visio as just a great tool for drawing flow diagrams and org charts. And when Visio Services was created, most people thought of it as just a way to render flow diagrams and org charts online. Prior to 2013, you could only use data linking to connect to data graphics in Visio and Visio Services. In Visio 2013 and Visio Services 2013, however, you can connect data to shape properties such as size and position, visibility, color, and geometry. You can even create or import custom shapes with custom properties that can also be connected to data.

For example, during Chris Hopkins’ presentation at SharePoint Conference 2012, he showed a retail example using a floor plan of a store, with circular racks that appeared more or less full based on inventory data residing in a database. To a retail buyer or merchandising coordinator, nothing spurs action like the image of an empty rack. You can also use hyperlinks in your shapes that let users drill to reports or other dashboards for a more detailed look, now that you have their attention.

You can also perform high-level customizations using the Vwa namespace in the JavaScript object model, or use the Visio Services class libraries to make custom data connections. You can find a listing of new Visio Services features for 2013 by clicking here.

Parallel Data Warehousing

In 2010 Microsoft shipped SQL Server 2008 R2 Parallel Data Warehouse (PDW), its first enterprise-class parallel data warehouse appliance. After a number of updates to the product, Microsoft will release its next-generation PDW appliance in the first half of 2013. SQL Server 2012 Parallel Data Warehouse is widely expected to include a redesigned architecture with significant improvements, including greater performance and a reduced hardware footprint at a lower cost. Also, as a result of the growing need to integrate relational database data with big data sources such as Hadoop, this latest version of PDW will include one significant enhancement: PolyBase. PolyBase will enable analysts to query data from both relational databases and Hadoop using a single unified query statement. It promises to reduce much of the complexity associated with accessing Hadoop data and integrating it with traditional relational data during analysis.

So with the continued need for faster analytics of ever-growing volumes of data, the rapid emergence of Big Data, and a lower entry cost than its competitors offer, Microsoft’s SQL Server 2012 Parallel Data Warehouse is expected to become a more common cornerstone of enterprise BI solutions during 2013 and beyond.

In-Memory Analytics

Building on the success of its in-memory data analytics work such as xVelocity, Microsoft’s new in-memory technology for online transaction processing (OLTP) databases is Hekaton (from the Greek word for one hundred). Similar in implementation to xVelocity, Hekaton employs compression techniques that promise to greatly increase the speed of data processing in transactional application databases. Note that Hekaton is a project code name; the technology is expected to be released in the next version of SQL Server. Few details are available, but one notable item we do know is that developers will have the option to host either individual tables or entire databases in memory. We should be hearing more about Hekaton from Microsoft as the year progresses.

Mobile BI

With the emergence of smartphones and the advent of tablet computing, it only seems natural that business users should be able to access BI analytics on the go from their mobile devices. To date, there have been some notable players who have proven successful in the Mobile BI market (RoamBI comes to mind), but until recently Microsoft mostly ignored this important aspect of business intelligence. To a certain extent this changed with the release of SQL Server 2012 Service Pack 1 in November 2012, which included a new feature enabling Reporting Services reports to work better interactively on iOS-enabled devices. In addition, Microsoft recently demonstrated its new, yet-to-be-released Mobile BI solution, “Project Helix,” at SharePoint Conference 2012.

Unfortunately, there is currently very little information available on “Project Helix” beyond the tweets and blog posts that followed the demonstration at SharePoint Conference 2012. Regardless, it would appear that Microsoft has turned a corner and definitely has plans for mobile BI. More will be revealed as 2013 progresses.

Cloud BI

Windows Azure SQL Reporting became available in a spring 2012 preview, with its pricing model going into effect August 1, 2012. Still, talk of Azure Reporting was quiet at both the SharePoint and PASS conferences compared to other technologies. Many people who have used Azure Reporting have found it difficult to set up and somewhat limited in its offering; for example, it does not provide a semantic layer comparable to Analysis Services. Some have also felt that the per-hour pricing model wasn’t for them.

For many companies, having a BI server on-premises will make sense, as native connectivity to SQL Azure has improved with the release of SQL Server 2012. While this may seem to defeat the purpose of hosting in the cloud, smaller organizations that mainly use their BI internally will find that the hardware costs of hosting Analysis Services and Reporting Services are not prohibitive if the larger OLTP and data warehouse layers are hosted in SQL Azure.

And since this is a post about predictions, we predict there is more to come in Microsoft cloud BI in the next few years, especially with Microsoft’s strategy of releasing many new features to Azure first.

SharePoint/Office Apps and the Apps Store

If you open Excel 2013, you will notice that “Blank workbook” is just one of the options that hits your screen. You also have the option to open a host of other templates, such as “My financial portfolio” or my personal favorite, “Weight-loss tracker.”

Don’t adjust your set; these are actually Excel 2013 apps. Apps are new in the world of Windows 8 and Office 2013, and more or less replace the idea of add-ins. Architecturally, they involve HTML5, CSS3, and JavaScript and use OAuth, REST, and other web protocols to connect to the apps, which are hosted either on-premises or in the cloud. These services can also call up additional hosted or third-party data (including Microsoft-provided data sets) and integrate it into the app’s delivery. These apps are made available in the Windows App Store (sound familiar?) with a model that allows the developer to take in a substantial share of whatever revenue they generate.

So what does this mean for BI? The idea of writing code to create “reporting tools” might once have seemed as foolish as using code to write a new spreadsheet application. However, this simple architectural model, combined with the reporting improvements in Excel and the rest of Office, will make it easier to write simpler, targeted, easily customizable reporting apps delivered in Office and SharePoint. Combine that with the distribution ease of the app store, and we believe SharePoint and Office apps will play prominently in BI delivery in the coming years.

Slalom Consulting Solution Architect Patrick Brady was a co-contributor to this post. These authors are members of Slalom’s Information Management Thought Leadership Committee. For more information, email the team at

MCM: Elevating Our Technical Expertise

There are a few things that make Portland unique—our affinity for great food (often food that comes out of a 4’ by 6’ metal food cart), our identity as a lifestyle destination (where else can you ski in the morning and be at the beach in the afternoon?), and just over 1 million distinct individuals, each with their own flair and personal style. At Slalom’s Portland office, we are celebrating yet another thing that makes us unique: we now have the privilege of working alongside Microsoft Certified Master Kyle Petersen! Kyle is one of a select few in the US (there are approximately 30) to have earned this certification for Microsoft SharePoint 2010, and one of only around 80 in the world! Please join me in congratulating Kyle on this momentous achievement, and read more about the importance and value his MCM will bring to our clients and to Slalom in my Q&A with Kyle below:

What exactly is a “Microsoft Certified Master”?

The MCM certification is the highest technical certification that Microsoft offers for some of its key technologies (e.g., Exchange, SQL, Lync, and SharePoint). What really differentiates this certification is its technical breadth and depth, and the requirement to truly demonstrate your technical mastery.

With many of the Microsoft certifications, there are lots of people who can buy exam guides, study, and pass the exams without ever having actually used the technology or skill. That is not possible in the MCM program, because you have to not only know the answers but understand the concepts and be able to demonstrate your expertise.

You can learn more about the certification here and here.

What must one achieve in order to be considered a Master?

Assuming you have 3 years of experience with SharePoint 2007 and SharePoint 2010, you will have to:

1. Pass the four basic SharePoint certification exams:

  • Exam 70-573: TS: Microsoft SharePoint 2010, Application Development
  • Exam 70-576: PRO: Designing and Developing Microsoft SharePoint 2010 Applications
  • Exam 70-667: TS: Microsoft SharePoint 2010, Configuring
  • Exam 70-668: PRO: SharePoint 2010, Administrator

2. Submit an application to the MCM program containing your resume and descriptions of the types of projects you have worked on.

3. Pass a phone screen to ensure you are technically ready to enter the program.

4. Complete the pre-reading list to ensure you have the basic fundamentals covered.

5. Complete 3 weeks of in-depth technical training.

6. Pass a 4-hour online knowledge exam.

7. Pass an 8-hour hands-on Qualification Lab that demonstrates your expertise.

Briefly describe your experience in the MCM bootcamp for the 3 weeks prior to the exam.

First off, I don’t think the term “bootcamp” really applies. That term has a connotation in the development community of paying to go off and get trained, then coming out with a guaranteed certification.

The MCM training rotation is much deeper and broader. Classes typically run 10 hours a day of 400+ level content. While there are lots of PowerPoint slides (over 2,000), the real information is delivered between the bullet points, so you have to stay engaged in the process. Class instructors are a mixture of MCMs, Microsoft Product Team employees, and Microsoft MVPs. They are the best in their fields and help provide amazing context to the subjects.

The training also provides hands-on labs to help solidify the skills that were covered and to help us explore the capabilities of various SharePoint features. Completing these labs is critical to fully understanding the concepts, so the labs consumed every evening and weekend.

A typical day: get up and head into class, which ran from 8 a.m. to 6 p.m., with some nights going past 7 p.m. Breaks are brief, and you get a quick lunch at the Microsoft cafeteria. After class it was back to my apartment for dinner, then time spent working on the labs. Then I would review the training materials and make notes I could use to study for my knowledge and qualification exams. Try to get some rest, and repeat. Weekends were a chance to catch up on labs I had not completed or did not yet understand well enough.

The last 2 days are for the certification tests. First is a 4-hour knowledge exam that is extremely challenging.

On the second day is an 8-hour hands-on qualification exam where we must complete assigned tasks. You have to fully know the subjects covered because there just is not enough time to be able to research an answer. That was the fastest 8 hours that ever slipped by because I was so absorbed and focused on trying to get it all completed within the time limit.

The pass rate for these exams is less than 50%. In my rotation there were 14 of us and only 7 passed. You are allowed up to three tries to pass the exams, but there are substantial costs involved with each retake.

How many Masters are there? Why so few?

I think the number-one reason there are so few is because of the cost and time commitment required to complete the certification. While the actual training was only 3 weeks, I spent the prior 3 months going over the pre-reading list and working on labs and examples to be sure I understood the concepts. And for consultants, three weeks of unbillable time can really mess up your overall utilization rate.

The second reason is that it’s really hard. While there are a lot of amazing SharePoint developers, they don’t necessarily have the infrastructure experience to set up a SharePoint farm. And there are a lot of great SharePoint administrators who don’t know how to write a custom web part. A SharePoint MCM requires end-to-end and top-to-bottom knowledge of the SharePoint product.

You can find the list of all of the MCMs here. I believe there are about 80 MCMs for SharePoint 2010 worldwide and about 30 in the US—including Microsoft employees.

How will this certification ultimately benefit our clients?  

Slalom Portland has, for several years, had a very strong SharePoint team and has helped many clients in the Portland area use SharePoint to run their businesses better. In addition to providing our clients with a level of comfort that Slalom has the most qualified resources possible, the MCM program gives its members information about SharePoint that is not available to the public or even to Microsoft’s highest-level partners in other programs. MCMs receive this information earlier and in more depth than any other non-Microsoft group, which enables Slalom to make better recommendations to our customers and be more efficient when troubleshooting issues. The MCM community also stays in close touch, jointly contributing solutions to the toughest SharePoint challenges out there, so any member can raise questions and have the others weigh in. Additionally, MCMs have unprecedented access to the Microsoft product team, which goes beyond even Slalom’s access as a nationally managed gold partner.

All of this enables Slalom to provide our clients with the best possible solutions, fully understanding the implications of design decisions. For example: over the past few months our clients have been asking us to design solutions in SharePoint 2010, sometimes highly customized, that will upgrade easily to 2013 or move easily to the cloud. The MCM program gives Slalom one more very powerful tool for making the best design choices and recommendations for our clients on their SharePoint roadmaps.

Why do it? 

When I first heard about the MCM program it was in the context of the 3 weeks of deep technical training. I thrive on the 400-level sessions at SharePoint conferences and thought that having access to 3 weeks of that level of training was an amazing opportunity.

Then I learned about the rest of the program—the prerequisites, the exams, the pass rate—and I was really intimidated and not sure I had the “right stuff.” Portland is a small market and we don’t often get the chance to work on large-scale enterprise solutions, so I felt I just didn’t have the exposure to the breadth of skills.

But ultimately the prize of the deep technical knowledge pushed me to take the chance and apply to the program. Getting training from the people responsible for the product features and the technical subject matter experts is such an amazing experience. It is not for everyone. It is hard, and it will test you. But passing the MCM means I can say “I know SharePoint.”

Congratulations, Kyle! We are proud to work alongside a true Master!


Tech Trends for 2013

Daniel Maycock is one of Slalom’s acknowledged thought leaders in the realm of new and emerging technology.

There were many significant technology advances during 2012 in a number of key areas, including the mainstream adoption of LTE and the rise of Big Data and analytics to the top of the enterprise IT agenda.

Companies went from adopting cloud platforms and services to leveraging those services and transforming their businesses.

  • Windows 8 has shown just how important Internet connectivity will be for computing in many capacities.
  • Every major IT vendor has focused to some extent on the convergence of mobile, cloud, analytics, and social, and on helping companies make IT a central part of every aspect of their business.
  • From Salesforce to Azure, cloud-based solutions are expected to grow even more in 2013.

As more and more companies begin waking up to this new reality, the question is not if adoption of key technologies such as cloud and mobile will take place, but how quickly and what can be done to make them work for the business as fast as possible. Furthermore, as these technologies are integrated deeper into the enterprise, it will be critical to keep in mind what other technologies will follow in their path. Read more of this post

Building Dynamic Services Using Web API and OData

Slalom Consultant Joel Forman is a Solution Architect specializing in cloud computing.

Over the past few years I have been building more and more RESTful services in .NET. The need for services that can be used from a variety of platforms and devices has been increasing. Oftentimes when we build a service layer to expose data over HTTP, we may be building it for one platform (web) but know that another platform (mobile) could be just around the corner. Simple REST services returning XML or JSON data structures provide the flexibility we need.

I have been very excited to follow the progress of ASP.NET Web API in MVC4. Web API makes creating RESTful services very easy in .NET, with some of the same ASP.NET MVC principles that .NET developers have already become accustomed to. There are some great overviews and tutorials on creating ASP.NET Web APIs. I encourage anyone to read through the tutorial Your First Web API for an initial overview.

Another important area to address when writing RESTful services is how to give consumers the ability to query data. How consumers want to retrieve and interact with the data is not always known, and trying to support a variety of parameters and operations to expose the data in different ways can be challenging. Enter the Open Data Protocol. OData is a web standard for defining a consistent query and update language using existing web technologies such as HTTP, Atom, and JSON. For example, OData defines URI conventions for supporting query operations such as the filtering, sorting, pagination, and ordering of data.

Microsoft has been planning support for OData in ASP.NET Web API. In the past few weeks, Microsoft released an OData package for Web API via NuGet that enables OData endpoints supporting the OData query syntax. Here is a blog post that describes the contents of that package in more detail.

Let’s take a closer look at how to take advantage of this new OData library. In the example below (being a consultant), I build a simple API that returns Projects from a data repository.

Web API provides an easy development model for creating an ApiController, with routing support for GET, POST, PUT, and DELETE automatically set up for you by the default Web API route. When you create a default ApiController, the GET method returns an IEnumerable<T>, where T is the model for your controller. For example, if you wanted a service to return data of type Project, you could create a ProjectsController whose GET method returns IEnumerable<Project>. The Web API framework handles the serialization of your object to JSON or XML for you. In my example running locally, I can use Fiddler to make requests against my REST API and inspect the results.
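As a rough sketch of what that default controller might look like (the Project model and the in-memory repository here are hypothetical stand-ins, not from a real project):

```csharp
using System;
using System.Collections.Generic;
using System.Web.Http;

// Hypothetical model used throughout the examples in this post.
public class Project
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string Market { get; set; }
    public DateTime StartDate { get; set; }
}

public class ProjectsController : ApiController
{
    // Stand-in for a real data repository.
    private static readonly List<Project> Repository = new List<Project>
    {
        new Project { Id = 1, Name = "Intranet Redesign", Market = "Seattle",
                      StartDate = new DateTime(2012, 10, 1) },
        new Project { Id = 2, Name = "BI Dashboard", Market = "Portland",
                      StartDate = new DateTime(2012, 11, 15) }
    };

    // GET /api/projects
    // Web API serializes the result to JSON or XML based on the
    // request's Accept header.
    public IEnumerable<Project> Get()
    {
        return Repository;
    }
}
```

With the default route, a GET request to /api/projects invokes the Get method above.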

Query: /api/projects

After adding a reference to the Microsoft ASP.NET Web API OData package via NuGet, there are only a couple small tweaks I need to make to my existing ProjectsAPIController to have the GET method support OData query syntax.

  1. Update the return type of your query (and underlying repository) to be IQueryable<T>.
  2. Add the [Queryable] attribute to your method.

What is happening here is that by returning an IQueryable<T>, we are deferring the execution of the query. Through the [Queryable] attribute, the OData package is able to interpret the query operations present on the request and, via an action filter, apply them to our query before it is actually executed. As long as you are able to defer the actual execution of the query against your data repository in this fashion, this practice should work great.
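Applied to a projects controller, the two tweaks might look roughly like this (the IProjectRepository interface is a hypothetical stand-in for whatever data layer you use, and Project is the same illustrative model as before):

```csharp
using System.Linq;
using System.Web.Http;

// Hypothetical repository interface that can hand back a deferred query.
public interface IProjectRepository
{
    IQueryable<Project> Projects { get; }
}

public class ProjectsController : ApiController
{
    private readonly IProjectRepository _repository;

    public ProjectsController(IProjectRepository repository)
    {
        _repository = repository;
    }

    // Tweak 1: return IQueryable<Project> so execution is deferred.
    // Tweak 2: the [Queryable] attribute (from the Web API OData package)
    // translates $filter, $orderby, $top, and $skip on the request into
    // operations on the query before it executes.
    [Queryable]
    public IQueryable<Project> Get()
    {
        return _repository.Projects;
    }
}
```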

Let’s try out a couple of queries in Fiddler against my API to show OData in action.

Query: /api/projects?$filter=Market eq 'Seattle'

Query: /api/projects?$orderby=StartDate desc&$top=5

The list of OData URI conventions for query syntax is extensive and can be very powerful. For example, the $top and $skip parameters can be used to provide consumers with a pagination solution for large data sets.

One key concept to understand with OData and Web API is the capabilities and limitations around when the query parameters can be applied against your data repository. For instance, what if the data repository is not able to return an IQueryable<T> result? You may be working with a repository that requires query parameters up front. In that case, there is an easy way to access the OData request, extract the different parameters supplied, and apply them in your query up front.
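One way this can be sketched, assuming the package’s ODataQueryOptions type is available for binding the raw query options (the paged repository interface shown is hypothetical):

```csharp
using System.Collections.Generic;
using System.Web.Http;
using System.Web.Http.OData.Query;

// Hypothetical repository that needs explicit paging parameters up front
// and cannot return an IQueryable<T>.
public interface IPagedProjectRepository
{
    IEnumerable<Project> GetProjects(int top, int skip);
}

public class ProjectsController : ApiController
{
    private readonly IPagedProjectRepository _repository;

    public ProjectsController(IPagedProjectRepository repository)
    {
        _repository = repository;
    }

    // Web API binds the request's OData query options to this parameter,
    // letting us read the raw $top and $skip values ourselves and pass
    // them to the repository before the query runs.
    public IEnumerable<Project> Get(ODataQueryOptions<Project> options)
    {
        int top = options.Top != null ? options.Top.Value : 25;
        int skip = options.Skip != null ? options.Skip.Value : 0;
        return _repository.GetProjects(top, skip);
    }
}
```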

The mapping of operations and parameters to your repository’s interface may not be ideal, but at least you are able to take advantage of OData and provide a standards-based experience to your consumers.

Support for building RESTful APIs in .NET just keeps getting better and better. With ASP.NET Web API and the upcoming support for OData in Web API, you can build dynamic APIs for multiple platforms quickly and easily.

Windows Server 2012: Part 6—Hyper-V

Slalom consultant and accomplished Microsoft systems developer Derek Martin sheds light on Windows Server 2012 (WS12) in his insightful blog series, which draws on his research within the technical preview documentation, personal experimentation with the product, and thoughts on how it can apply to the real world as soon as it is released to manufacturing (RTM).

Slalom Consultant Derek Martin is an accomplished Microsoft systems developer and integrator, experienced in developing and deploying SharePoint and CRM solutions, integrating line of business applications, and leveraging existing infrastructure investments.

In Windows Server 2012, the concept of the private cloud is finally at your fingertips. Long gone are the half-baked, half-delivered features of Windows Server that promised ‘virtualization.’ VMware had Microsoft and the rest of the cloud folks well under control, which explains, at least in part, their very unpopular price increase when vSphere 5 rolled out.

In their defense, I’ve never seen a popular price increase, but I digress. Windows Server 2012 introduces so many new features into the basic OS that it is the premier choice for building clouds. First among them are the new features within Hyper-V—Microsoft’s hypervisor. Even without the other features I’ve discussed in previous entries of this series, Hyper-V would still have gotten mad props for all the great changes the team has made. With those other features, Microsoft has truly moved the bar, and now it will be the other players that play catch-up. Again, following along our list of content from TechNet, we dive in and look at the major changes/additions: Read more of this post
