18 Aug, 2017

A handy testing checklist for a developer

  • DevLabs Expert Group
  • Business
  • no comments

Over the last few years, the industry has slowly but steadily bought into the fact that quality is a collective responsibility. While it is the tester’s neck that is on the line for quality issues and missed bugs, today it is welcome to see everyone step in to understand what role they could have played in avoiding such an issue in the first place. The industry as a whole is also starting to appreciate not just reactive analysis but the need for proactive intervention upfront to empower such collective ownership. Read more »

18 Aug, 2017

User Experience in the World of E-learning app development

  • DevLabs Expert Group
  • Business
  • Tags: E-learning App Development
  • no comments

E-learning app development is unique in its own way. While it imbibes traditional app development practices, one standout element is user experience, and the user base is often very large and diverse: it could be a kindergarten kid or the top executive of a large organization. There is no bound to the profile of end users that leverage e-learning apps, which makes user experience the key to an app’s success. Read more »

18 Aug, 2017

Need for holistic perspective in app development

  • QA InfoTech
  • Business,DevLabs Expert Group
  • Tags: App development services, mobile app development, multi-platform mobile development, Smartphone app development
  • no comments

Developers have a large responsibility on their plate, don’t they? Today, quality is a collective responsibility. While we say testers have a responsibility to look at the application from a big-picture, end-user perspective, developers also have a stake: they cannot merely look at system requirements and code away. While that is a starting point, a good developer does a lot more. He or she looks at the application or module under development holistically, from functionality, performance, security and usability angles, amongst others. A responsible developer also writes robust unit test cases to ensure the quality of the code. He collaborates with testers to understand the test scenarios, the quality of the module, where the application falls short and what else can be done, to ensure the chemistry is great, both in the relationship he shares with testers and in the product’s readiness for release.

Read more »

07 Apr, 2017

Choose the Workflow for Entity Framework

  • nehasaini_qait
  • Business,Dot Net
  • Tags: database, dotnet, entity framework, framework, technology
  • no comments


Two things to keep in mind while choosing the workflow.

Things that are outside our control: whether we are working against a new database or an existing database.

Things that are inside our control: whether we create the model using the designer or by writing code.

Model First—

  1. Create the model in the designer.
  2. Generate the database from the model.
  3. The classes that the application is going to interact with are auto-generated from the model.

Database First—

  1. Reverse-engineer the model in the designer.
  2. Classes are auto-generated from the model.

Code First (New Database) —

  1. Define the model in code. The model is made up of the main classes that are going to interact with the application. Optionally, code for mapping and configuration can be provided.
  2. The database is created from the model.
  3. If the model changes, Code First can be used to evolve the database. (A short sketch of this workflow follows at the end of this post.)

Code First (Existing Database) —

  1. Define classes and mapping in code.
  2. Reverse engineering tools are available.
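
For illustration, here is a minimal Code First (new database) sketch, assuming Entity Framework 6 is referenced; the BlogContext, Post and Comment names are made up for the example:

using System.Collections.Generic;
using System.Data.Entity;   // Entity Framework 6

// Model: the plain classes the application is going to interact with.
public class Post
{
    public int PostId { get; set; }           // primary key by convention
    public string Title { get; set; }
    public List<Comment> Comments { get; set; }
}

public class Comment
{
    public int CommentId { get; set; }
    public string Text { get; set; }
    public int PostId { get; set; }           // foreign key by convention
}

// Context: on first use, the database is created from this model.
public class BlogContext : DbContext
{
    public DbSet<Post> Posts { get; set; }
    public DbSet<Comment> Comments { get; set; }
}
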
07 Dec, 2015

Difference Between Array and Arraylist

  • Yogeshwar Singh Chauhan
  • Business
  • no comments

Array

Arrays are a strongly typed collection of the same datatype, and they have a fixed length that can’t be changed at runtime. Arrays store values on an index basis, starting with zero. If we want to access a value from an array, we need to pass its index.

Declaration of Arrays

We can declare arrays as shown below.

using System;
namespace WebApplication2
{
    public partial class _Default : System.Web.UI.Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            // Initialize the array items during declaration
            string[] strNames = new string[]
            {
                "Ramakrishna",
                "Praveenkumar",
                "Indraneel",
                "Neelohith"
            };
            // Declare array and assign values using index
            string[] strNamesObj = new string[4];
            strNamesObj[0] = "Ramakrishna";
            strNamesObj[1] = "Praveenkumar";
            strNamesObj[2] = "Indraneel";
            strNamesObj[3] = "Neelohith";
        }
    }
}

ArrayList

An ArrayList is not a strongly typed collection. It can store values of different datatypes or the same datatype. An ArrayList’s size increases or decreases dynamically, and it can hold any number of values of any datatype. ArrayList is available in the System.Collections namespace.

Declaration of ArrayList

To see how to declare and store values in an ArrayList, check the code below.

using System;
using System.Collections;
namespace WebApplication2
{
    public partial class _Default : System.Web.UI.Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            ArrayList arrListObj = new ArrayList();
            // Adding string value to arraylist
            arrListObj.Add("Ramakrishna Basagalla");
            // Adding integer value to arraylist
            arrListObj.Add(1981);
            // Adding float value to arraylist
            arrListObj.Add(11.99);
        }
    }
}

In the above example we have not specified any size for the ArrayList; we can add any number of items, as there is no size limit.

Difference between Array and ArrayList

Array                                                         | ArrayList
Arrays are a strongly typed collection of the same datatype.  | ArrayList is not a strongly typed collection; it can store different types of data.
Data is of fixed length.                                      | The size can be increased dynamically.
Arrays derive from the System.Array class.                    | ArrayList belongs to the System.Collections namespace.
Data is added using the array index.                          | Data is added using the Add method.
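
To make the type-safety difference concrete, here is a small sketch (not from the original post; the values are arbitrary) showing that an array element already has the right compile-time type, while reading from an ArrayList requires a cast:

using System;
using System.Collections;

class ArrayVsArrayListDemo
{
    static void Main()
    {
        // Array: element type is known at compile time, so no cast is needed.
        string[] names = { "Ramakrishna", "Praveenkumar" };
        string first = names[0];

        // ArrayList: elements come back as object, so an explicit cast is required.
        ArrayList list = new ArrayList();
        list.Add("Ramakrishna");
        list.Add(1981);                  // a different datatype is allowed
        string name = (string)list[0];   // cast from object
        int year = (int)list[1];         // unboxing cast

        Console.WriteLine("{0}, {1}, {2}", first, name, year);
    }
}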

Hope this blog helps you in understanding Array and ArrayList. Happy coding.

07 Dec, 2015

Marshaling native arrays back as managed arrays without copying

  • Yogeshwar Singh Chauhan
  • Business
  • no comments

I’ve come up with a trick that can be used in some very specific scenarios in order to avoid extra array copying when calling into native code from managed code (e.g. C#). This won’t usually work for regular P/Invokes into all your favorite Win32 APIs, but I’m hopeful it’ll be useful for someone somewhere. It’s not even evil! No hacks required.

Many native methods require the caller to allocate the array and specify its length, and then the callee fills it in or returns an error code indicating that the buffer is too small. The technique described in this post is not necessary for those, as they can already be used optimally without any copying.

Instead, let’s talk about the general problem if you’re calling a native method which does the array allocation and then returns it. You can’t use it as a “managed array” unless you copy it into a brand new managed array (don’t forget to free the native array). In other words, native { T* pArray; size_t length; } cannot be used as a simple managed T[] as-is (or even with modification!). The managed runtime didn’t allocate it, won’t recognize it, and there’s nothing you can do about it. Very few managed methods will accept a pointer and a length; most require a managed array. This is particularly irksome when you want to use System.IO.Stream.Read() or Write() with bytes from a native-side buffer.

Paint.NET uses a library written in classic C called General Polygon Clipper (GPC), from The University of Manchester, to perform polygon clipping. This is used for, among other things, when you draw a selection with a mode such as add (union), subtract (exclude), intersect, and invert (“xor”). I blogged about this 4 years ago when version 3.35 was about to be released: using GPC made these operations immensely faster, and I saved a lot of time and headache by purchasing a commercial use license for the library and then integrating it into the Paint.NET code base. tl;dr: The algorithms for doing this are nontrivial and rife with special corner cases, and I’d been struggling to find enough sequential time to implement and debug it on my own.

Anyway, the data going into and coming out of GPC is an array of polygons. Each polygon is an array of points, each of which is just a struct containing X and Y as double-precision floating point values. To put it simply, it’s just a System.Windows.Point[][] (I actually use my own geometry primitives nowadays, but that’s another story, and it’s the same exact thing).

Getting this data into GPC from the managed side is easy. You pin every array, and then hand off the pinned pointers to GPC. Since you can’t use the “fixed” expression with a dynamic number of elements, I use GCHandle directly and stuff them all into GCHandle[] arrays for the duration of the native call. This is great because on the managed side I can work with regular ol’ managed arrays, and then send them off to GPC as “native arrays” by pinning them and using the pointers obtained from GCHandle.AddrOfPinnedObject().

Now, here’s the heart breaking part. GPC allocates the output polygon using good ol’ malloc*. So when I get the result back on the managed side, I must copy every single last one so that I can use it as a Point[] (a managed array). This ends up burning a lot of CPU time, and can cause virtual address space claustrophobia on 32-bit/x86 systems when working with complex selections (e.g. Magic Wand), as you must have enough memory for 2 copies of the result while you’re doing the copying. (Or you could free each native array after you copy it into a managed array, but that’s an optimization for another day, and isn’t as straightforward as you’d think because freeing the native memory requires another P/Invoke, and those add up, and so it might not actually be an optimization.)

But wait, there’s another way! Since the code for GPC is part of my build, I can modify it. So I added an extra parameter called gpc_vertex_calloc:

    typedef gpc_vertex * (__stdcall * gpc_vertex_calloc_fn)(int count);

    GPC_DLL_EXPORT(
    void, gpc_polygon_clip)(
        gpc_op                set_operation,
        gpc_polygon          *subject_polygon,
        gpc_polygon          *clip_polygon,
        // result_polygon holds the arrays of gpc_vertex, aka System.Windows.Point
        gpc_polygon          *result_polygon,
        gpc_vertex_calloc_fn  gpc_vertex_calloc);

(“gpc_vertex” is GPC’s struct that has the same layout as System.Windows.Point: X and Y, defined as a double.)

In short, I’ve changed GPC so that it uses an external allocator by passing in a function pointer it should use instead of malloc. And now if I want I can have it use malloc, HeapAlloc, VirtualAlloc, or even the secret sauce detailed below.

On the managed side, the interop delegate definition for gpc_vertex_calloc_fn gets defined as such:

    [UnmanagedFunctionPointer(CallingConvention.StdCall)]
public delegate IntPtr gpc_vertex_calloc_fn(int count);

And gpc_polygon_clip’s interop definition is like so:

    [DllImport("PaintDotNet.SystemLayer.Native.x86.dll", CallingConvention = CallingConvention.StdCall)]
    public static extern void gpc_polygon_clip(
        [In] NativeConstants.gpc_op set_operation,
        [In] ref NativeStructs.gpc_polygon subject_polygon,
        [In] ref NativeStructs.gpc_polygon clip_polygon,
        [In, Out] ref NativeStructs.gpc_polygon result_polygon,
        [In] [MarshalAs(UnmanagedType.FunctionPtr)] NativeDelegates.gpc_vertex_calloc_fn gpc_vertex_calloc);

So, we’re half way there, and now we need to implement the allocator on the managed side.

    internal unsafe sealed class PinnedManagedArrayAllocator<T>
        : Disposable
          where T : struct
    {
        private Dictionary<IntPtr, T[]> pbArrayToArray;
        private Dictionary<IntPtr, GCHandle> pbArrayToGCHandle;

        public PinnedManagedArrayAllocator()
        {
            this.pbArrayToArray = new Dictionary<IntPtr, T[]>();
            this.pbArrayToGCHandle = new Dictionary<IntPtr, GCHandle>();
        }

        // (Finalizer is already implemented by the base class (Disposable))

        protected override void Dispose(bool disposing)
        {
            if (this.pbArrayToGCHandle != null)
            {
                foreach (GCHandle gcHandle in this.pbArrayToGCHandle.Values)
                {
                    gcHandle.Free();
                }

                this.pbArrayToGCHandle = null;
            }

            this.pbArrayToArray = null;

            base.Dispose(disposing);
        }

        // Pass a delegate to this method for "gpc_vertex_calloc_fn". Don't forget to use GC.KeepAlive() on the delegate!
        public IntPtr AllocateArray(int count)
        {
            T[] array = new T[count];
            GCHandle gcHandle = GCHandle.Alloc(array, GCHandleType.Pinned);
            IntPtr pbArray = gcHandle.AddrOfPinnedObject();
            this.pbArrayToArray.Add(pbArray, array);
            this.pbArrayToGCHandle.Add(pbArray, gcHandle);
            return pbArray;
        }

        // This is what you would use instead of, e.g. Marshal.Copy()
        public T[] GetManagedArray(IntPtr pbArray)
        {
            return this.pbArrayToArray[pbArray];
        }
    }

(“Disposable” is a base class which implements, you guessed it, IDisposable, while also ensuring that Dispose(bool) never gets called more than once. This is important for other places where thread safety is very important. This class is specifically not thread safe, but it should be reasonably easy to make it so.)

And that’s it! Well, almost. I’m omitting the guts of the interop code but nothing that should inhibit comprehension of this part of it. Also, the above code is not hardened for error cases, and should not be used as-is for anything running on a server or in a shared process. Oh, and I just noticed that my Dispose() method has an incorrect implementation, whereby it shouldn’t be using this.pbArrayToGCHandle, specifically it shouldn’t be foreach-ing on it, and should instead wrap that in its own IDisposable-implementing class … exercise for the reader? Or I can post a fix later if someone wants it.

After I’ve called gpc_polygon_clip, instead of copying all the arrays using something like System.Runtime.InteropServices.Marshal.Copy(), I just use GetManagedArray() and pass in the pointer that GPC retrieved from its gpc_vertex_calloc_fn, aka AllocateArray(). When I’m done, I dispose the PinnedManagedArrayAllocator and it unpins all the managed arrays. And this is much faster than making copies of everything.
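
For completeness, here is a rough usage sketch of how these pieces fit together. This is not the author’s actual call site: the NativeMethods class name, the gpc_op value, the contourPointer variable, and the already-marshaled subject and clip polygons are assumed for illustration only.

// Illustrative sketch; assumes the interop declarations above plus already-marshaled 'subject' and 'clip'.
using (var allocator = new PinnedManagedArrayAllocator<System.Windows.Point>())
{
    // The allocator method becomes the gpc_vertex_calloc_fn delegate handed to native code.
    NativeDelegates.gpc_vertex_calloc_fn vertexCalloc = allocator.AllocateArray;

    NativeStructs.gpc_polygon result = new NativeStructs.gpc_polygon();
    NativeMethods.gpc_polygon_clip(
        NativeConstants.gpc_op.GPC_UNION,   // assumed enum value for a union/"add" clip
        ref subject,
        ref clip,
        ref result,
        vertexCalloc);

    // Keep the delegate alive until the native call has definitely finished using it.
    GC.KeepAlive(vertexCalloc);

    // Instead of Marshal.Copy(), look each contour up by the pointer GPC got from AllocateArray():
    // System.Windows.Point[] points = allocator.GetManagedArray(contourPointer);
}
// Disposing the allocator unpins all of the managed arrays it handed out.
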

Now, this isn’t the exact code I’m using. I’ve un-generalized it in the real code so I can allocate all of the arrays at once instead of incurring potentially hundreds of managed <—> native transitions for each allocation. The above implementation also doesn’t have a “FreeArray” method; I had one, but I ended up not needing it, so I removed it.

So the next time you find yourself calling into native code which either 1) allows you to specify an external allocator, or 2) is part of your build, and that 3) involves lots of data and thus lots of copying and wasted CPU time, you might just consider using the tactic above. Your users will thank you.

07 Dec, 2015

Creating a Web service with VB.NET

  • Yogeshwar Singh Chauhan
  • Business
  • no comments

Web services technology is based on HTTP, Simple Object Access Protocol (SOAP), and XML. Since Web services use open standards, calling Web services is fairly simple.

VB.NET allows you to use Web services as if they were entirely local objects since most of the marshaling between the client and the server is taking place in the background. This tip shows you how to create a simple Web service.

Example

I will create a Web service and add a Web function that will return the current machine’s IP address. Here are the steps for creating a Web service:

  1. Open Visual Studio.Net and select Create New Website under VB.NET.
  2. Select Web Service from the options listed.
  3. Once you get the code window open, add the following code:
<WebMethod()> _
Public Function GetMachineIPAddress() As String
    Dim strHostName As String = ""
    Dim strIPAddress As String = ""
    Dim host As System.Net.IPHostEntry

    strHostName = System.Net.Dns.GetHostName()
    strIPAddress = System.Net.Dns.GetHostEntry(strHostName).HostName.ToString()
    host = System.Net.Dns.GetHostEntry(strHostName)

    Dim ip As System.Net.IPAddress
    For Each ip In host.AddressList
        Return ip.ToString()
    Next

    Return ""
End Function

Once you debug the example, you will see a screen that looks like Figure A.

It will list two available Web methods: the one that was generated (HelloWorld) and the one I created (GetMachineIPAddress). Click the link for GetMachineIPAddress, and you will see a screen with more information about this method and the option to invoke it (Figure B).

Click the Invoke button, and the result will look like Figure C.

04 Dec, 2015

C#: A Review on Generics

  • Yogeshwar Singh Chauhan
  • Business,Company,Dot Net,Events
  • no comments

I think a good C# developer needs to get a handle on .NET generics.  Most of the advanced features in C# deal with lots of generics and having a very good understanding of generics will help considerably, most especially when dealing with generic delegates.  So here in this post, we will review generics.

Generic type definitions can be methods, classes, structures, and interfaces.

// an example generic class
public class MyGenericClass<T>
{
    public T MyProperty;
}

 

 

The placeholders (e.g. <T>) are called generic type parameters, or type parameters.

You specify the actual types to substitute for the type parameters during instantiation.

// instantiate generic class with string type
MyGenericClass<string> c = new MyGenericClass<string>();

 

 

When instantiated, a generic type definition becomes a constructed generic type.

You can place limits or constraints on generic type parameters.

// limit type to value types except Nullable
public class MyGenericClass<T> where T : struct {/*...*/}

// limit type to reference types which can include classes,
//  interfaces, delegates, and array types
public class MyGenericClass<T> where T : class {/*...*/}

// limit type to types with public parameterless constructor
// must be specified last in multiple constraints
public class MyGenericClass<T> where T : new() {/*...*/}

// limit type to specified base class or to types derived from it
public class MyGenericClass<T> where T : MyBaseClass {/*...*/}

// limit type to specified interface or to types that implement it
public class MyGenericClass<T> where T : IMyInterface {/*...*/}

// limit type to specified type parameter    

// in a generic member function, it limits its type to the type 
//  parameter of the containing type
public class MyGenericClass<T>
{
    void MyMethod<U>(List<U> items) where U : T {/*...*/}
}

// in a generic class, it enforces an inheritance relationship
//  between the two type parameters
public class MyGenericClass<T, U> where U : T {/*...*/}

// type parameter can have multiple constraints 
//  and constraints can also be generics
//  and constraints can be applied to multiple type parameters
public class MyGenericClass<T, U> 
    where T : MyClass, IMyInterface, System.IComparable<T>, new()
    where U : struct
{
    // ...
}

 

 

A method is considered a generic method definition if it has two parameter lists: a list of type parameters enclosed in <> and a list of formal parameters enclosed in ().  Whether the method belongs to a generic or a non-generic type does not make the method generic or non-generic.  Only the existence of the two parameter lists makes the method generic, as in the example below.

public class MyClass
{
    // a generic method inside a non-generic class
    T MyGenericMethod<T>(T arg) {/*...*/}
}
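
As a self-contained usage note (this sketch is not from the post), the type argument of a generic method can be given explicitly or inferred from the arguments:

using System;

public class EchoDemo
{
    // A generic method: one type parameter list in <>, one formal parameter list in ().
    static T Echo<T>(T arg)
    {
        return arg;
    }

    static void Main()
    {
        string s1 = Echo<string>("hello");   // explicit type argument
        string s2 = Echo("hello");           // T inferred from the argument
        Console.WriteLine("{0} {1}", s1, s2);
    }
}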

 

 

A type nested in a generic type is considered by the CLR to be generic even if it doesn’t have generic type parameters of its own.  When you instantiate a nested type, you need to specify the type arguments for all enclosing generic types.

// generic type
public class MyGenericType<T, U>
{
    // nested type
    public class MyNestedType
    {
        // ...
    }
}

// ... somewhere in code you instantiate the nested type   
MyGenericType<string, int>.MyNestedType nt = 
    new MyGenericType<string, int>.MyNestedType();

 

 

The following are some common generic collection counterparts provided by the .NET framework:

  • Dictionary<TKey, TValue> which is the generic version of Hashtable.  It uses KeyValuePair<TKey, TValue> for enumeration instead of DictionaryEntry (see the sketch after this list).
  • List<T> which is the generic version of ArrayList.
  • Queue<T> and Stack<T> which are the generic versions of the collections with the same names.
  • SortedList<TKey, TValue> which is a hybrid of a dictionary and a list, just like its nongeneric version of the same name.
  • SortedDictionary<TKey, TValue> which is a pure dictionary, and LinkedList<T>.  Neither has a nongeneric version.
  • Collection<T> which is a base class for generating custom collection types, ReadOnlyCollection<T> which provides read-only collection from any type implementing IList<T>, and KeyedCollection<TKey, TItem> for storing objects containing their own keys.
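
Here is a minimal sketch (the data is made up for illustration) of the Dictionary<TKey, TValue> and List<T> counterparts mentioned above:

using System;
using System.Collections.Generic;

class GenericCollectionsDemo
{
    static void Main()
    {
        // Generic version of Hashtable; enumeration yields KeyValuePair<TKey, TValue>.
        Dictionary<string, int> unitsInStock = new Dictionary<string, int>();
        unitsInStock.Add("Chai", 39);
        unitsInStock["Chang"] = 17;

        foreach (KeyValuePair<string, int> pair in unitsInStock)
        {
            Console.WriteLine("{0}: {1}", pair.Key, pair.Value);
        }

        // Generic version of ArrayList; no casting or boxing of the elements.
        List<int> quantities = new List<int>(unitsInStock.Values);
        quantities.Sort();
        Console.WriteLine(quantities[0]);
    }
}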

 

There are also generic interface counterparts for ordering and equality comparisons and for shared collection functionality:

  • System.IComparable<T> and System.IEquatable<T> which define methods for ordering comparisons and equality comparisons.
  • IComparer<T> and IEqualityComparer<T> in the System.Collections.Generic namespace, which offer an alternative way for types that do not implement System.IComparable<T> and System.IEquatable<T>.  They are used by methods and constructors of many of the generic collection classes.  An example would be passing a generic IComparer<T> object to the constructor of SortedDictionary<TKey, TValue> to specify a sort order (see the sketch after this list).  The generic classes Comparer<T> and EqualityComparer<T> are their base class implementations.
  • ICollection<T> which provides basic functionality for adding, removing, copying, and enumerating elements in a generic collection type.  It inherits from IEnumerable<T> and the nongeneric IEnumerable.
  • IList<T> which extends ICollection<T> with methods for indexed retrieval.
  • IDictionary<TKey, TValue> which extends ICollection<T> with methods for keyed retrieval.  Generic dictionary types also inherit from nongeneric IDictionary.
  • IEnumerable<T> which provides a generic enumerator structure used by foreach.  It inherits from nongeneric IEnumerator because MoveNext and Reset methods appear only on the nongeneric interface.  This means consumer of the nongeneric interface can also consume the generic interface because the generic interface provides for nongeneric implementation.
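
For instance, a small sketch (the LengthComparer class is made up for illustration) of passing an IComparer<T> to the constructor of SortedDictionary<TKey, TValue> to control the sort order:

using System;
using System.Collections.Generic;

// Orders string keys by length instead of alphabetically (null handling omitted for brevity).
class LengthComparer : IComparer<string>
{
    public int Compare(string x, string y)
    {
        int byLength = x.Length.CompareTo(y.Length);
        return byLength != 0 ? byLength : string.CompareOrdinal(x, y);
    }
}

class SortedDictionaryDemo
{
    static void Main()
    {
        // The comparer passed to the constructor defines the key ordering.
        SortedDictionary<string, decimal> prices =
            new SortedDictionary<string, decimal>(new LengthComparer())
            {
                { "Chai", 18.00m },
                { "Chang", 19.00m },
                { "Ikura", 31.00m }
            };

        foreach (KeyValuePair<string, decimal> pair in prices)
        {
            Console.WriteLine("{0}: {1}", pair.Key, pair.Value);
        }
    }
}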

 

You also have generic delegates in .NET framework.  An example is the EventHandler<TEventArgs> which you can use in handling events with custom event arguments.  No need to declare your own delegate type for the event.  If you need to brush up on events and delegates, see my post on raising events and nongeneric delegates.

public event EventHandler<PublishedEventArgs> Published;
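
To round that out, here is a small sketch (PublishedEventArgs and Publisher are made up for illustration) of declaring and raising such an event:

using System;

// Custom event data; no custom delegate type is needed thanks to EventHandler<TEventArgs>.
public class PublishedEventArgs : EventArgs
{
    public string Title { get; set; }
}

public class Publisher
{
    public event EventHandler<PublishedEventArgs> Published;

    public void Publish(string title)
    {
        // Copy to a local to avoid a race if the last subscriber detaches concurrently.
        EventHandler<PublishedEventArgs> handler = Published;
        if (handler != null)
        {
            handler(this, new PublishedEventArgs { Title = title });
        }
    }
}

class EventDemo
{
    static void Main()
    {
        Publisher publisher = new Publisher();
        publisher.Published += (sender, e) => Console.WriteLine("Published: " + e.Title);
        publisher.Publish("Generics review");
    }
}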

 

 

There are also a bunch of useful generic delegates available for manipulating arrays and lists:

  • Action<T> which allows you to perform an action on an element by passing an Action<T> delegate instance and an array to the generic method Array.ForEach<T>.  You can also pass an Action<T> delegate instance to the nongeneric method List<T>.ForEach.
  • Predicate<T> which allows you to specify a search criterion for Array’s Exists<T>, Find<T>, FindAll<T>, and so on, and also for List<T>’s Exists, Find, FindAll, and so on (see the sketch after this list).
  • Comparison<T> which allows you to provide a sort order.
  • Converter<TInput, TOutput> which allows you to convert between two types of arrays or lists.
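
A quick sketch (the array contents are arbitrary) of Predicate<T>, Converter<TInput, TOutput>, and Action<T> used with the Array methods mentioned above:

using System;

class GenericDelegatesDemo
{
    static void Main()
    {
        int[] numbers = { 4, 8, 15, 16, 23, 42 };

        // Predicate<T>: the search criteria for Array.FindAll<T>.
        int[] evens = Array.FindAll(numbers, n => n % 2 == 0);

        // Converter<TInput, TOutput>: convert between two array types.
        string[] labels = Array.ConvertAll(evens, n => "#" + n);

        // Action<T>: perform an action on each element with Array.ForEach<T>.
        Array.ForEach(labels, label => Console.WriteLine(label));
    }
}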

 

Ok so that’s all we have for generics.  Just a review of the basics that a C# developer needs to know.

03 Dec, 2015

SQL SERVER – Two Methods to Retrieve List of Primary Keys and Foreign Keys of Database

  • Yogeshwar Singh Chauhan
  • Business,Company,Dot Net,Events
  • no comments

There are two different methods of retrieving the list of Primary Keys and Foreign Keys from a database.

Method 1: INFORMATION_SCHEMA

SELECT
DISTINCT
Constraint_Name AS [Constraint],
Table_Schema AS [Schema],
Table_Name AS [TableName]
FROM INFORMATION_SCHEMA.KEY_COLUMN_USAGE
GO

Method 2: sys.objects

SELECT OBJECT_NAME(OBJECT_ID) AS NameofConstraint,
SCHEMA_NAME(schema_id) AS SchemaName,
OBJECT_NAME(parent_object_id) AS TableName,
type_desc AS ConstraintType
FROM sys.objects
WHERE type_desc IN ('FOREIGN_KEY_CONSTRAINT','PRIMARY_KEY_CONSTRAINT')
GO

I am often asked about my preferred method of retrieving the list of Primary Keys and Foreign Keys from a database. I have a standard answer: I prefer Method 2, which queries the sys schema. The reason is very simple: the sys schema always provides more information, and all the data can be retrieved in our preferred fashion with the preferred filter.

Let us look at the example we have on our hand. When Information Schema is used, we will not be able to discern between a primary key and a foreign key; we will have both keys together. In the case of the sys schema, we can query the data in our preferred way and can join this view to other system tables to retrieve additional data.

Let us play a small puzzle here. Try to modify both the scripts in such a way that we are able to see the original definition of the key, that is, the CREATE statement for this primary key and foreign key.

If I get an appropriate answer from my readers, I will publish the solution on this blog with due credit.

03 Dec, 2015

Understanding LINQ to SQL Object-Relational Mapping

  • Yogeshwar Singh Chauhan
  • Business,Company,Dot Net
  • no comments

According to Wikipedia, Object-relational mapping is:

a programming technique for converting data between incompatible type systems in relational databases and object-oriented programming languages.

This is the LINQ to SQL sample code at the beginning of this series:

using (NorthwindDataContext database = new NorthwindDataContext())
{
    var results = from product in database.Products
                  where product.Category.CategoryName == "Beverages"
                  select new
                  {
                      product.ProductName,
                      product.UnitPrice
                  };
    foreach (var item in results)
    {
        Console.WriteLine(
            "{0}: {1}", 
            item.ProductName, 
            item.UnitPrice.ToString(CultureInfo.InvariantCulture));
    }
}

According to this post, the above query expression will be compiled to query methods:

var results = database.Products.Where(product => product.Category.CategoryName == "Beverages")
                               .Select(product => new
                                                      {
                                                          product.ProductName,
                                                          product.UnitPrice
                                                      });

It queries the ProductName and UnitPrice fields of those records in the Products table of the Northwind database that belong to the specified CategoryName. To work with SQL Server representations (fields, tables, databases) as C# representations (object models), the mappings between the SQL representations and the C# representations need to be created. LINQ to SQL provides an Object-relational mapping designer tool to create those object models automatically.

Create C# models from SQL schema

The easiest way of modeling is to use the Visual Studio IDE. This way works with:

  • SQL Server 2000
  • SQL Server 2005
  • SQL Server 2008
  • SQL Server 2008 R2

Take the Northwind database as an example. First, set up a data connection to the Northwind database.

Then, add a “LINQ to SQL Classes” item to the project.

Creating a Northwind.dbml file opens the O/R designer.

Since the above query works with the Products table and the Categories table, just drag the two tables and drop them onto the O/R designer.

In the designer, the modeling is done. Please notice that the foreign key between the Categories table and the Products table is recognized, and the corresponding association is created in the designer.

Now the object models are ready to rock. Actually the designer has automatically created the following C# code:

  • Category class: represents each record in Categories table;
    • CategoryID property (an int): represents the CategoryID field; So are the other properties shown above;
    • Products property (a collection of Product objects): represents the many associated records in the Products table
  • Product class: represents each record in Products table;
    • ProductID property (an int): represents the ProductID field; So are the other properties shown above;
    • Category property (a Category object): represents the associated record in the Categories table;
  • NorthwindDataContext class: represents the Northwind database;
    • Categories property (a collection of the Category objects): represents the Categories table;
    • Products property (a collection of the Product objects): represents the Products table;

Besides, databases, tables, fields, and other SQL stuff can also be modeled by this O/R designer:

SQL representation           | C# representation                      | Sample
Database                     | DataContext derived class              | NorthwindDataContext
Table, View                  | DataContext derived class’s property   | NorthwindDataContext.Categories
Record                       | Entity class                           | Category
Field                        | Entity class’s property                | Category.CategoryName
Foreign key                  | Association between entity classes     | Category.Products
Stored procedure, function   | DataContext derived class’s method     | NorthwindDataContext.SalesByCategory()

Another way to generate the models is to use the command line tool SqlMetal.exe. Please check MSDN for details of code generation.

And please notice that the Category entity class is generated from the Categories table. Here the plural name is renamed to a singular name, because a Category object is the mapping of one record of the Categories table. This can be configured in Visual Studio.

Implement the mapping

Now take a look at how the SQL representations are mapped to C# representations.

The Northwind.dbml is nothing but an XML file:

<?xml version="1.0" encoding="utf-8"?>
<!-- [Northwind] database is mapped to NorthwindDataContext class. -->
<Database Name="Northwind" Class="NorthwindDataContext" xmlns="http://schemas.microsoft.com/linqtosql/dbml/2007">
    <!-- Connection string -->
    <Connection Mode="WebSettings" ConnectionString="Data Source=qablog.qaitdevlabs.com;Initial Catalog=Northwind;Integrated Security=True" SettingsObjectName="System.Configuration.ConfigurationManager.ConnectionStrings" SettingsPropertyName="NorthwindConnectionString" Provider="System.Data.SqlClient" />

    <!-- Categories property is a member of NorthwindDataContext class. -->
    <Table Name="dbo.Categories" Member="Categories">
        <!-- [Categories] table is mapped to Category class. -->
        <Type Name="Category">
            <!-- [CategoryID] (SQL Int) field is mapped to CategoryID property (C# int). -->
            <Column Name="CategoryID" Type="System.Int32" DbType="Int NOT NULL IDENTITY" IsPrimaryKey="true" IsDbGenerated="true" CanBeNull="false" />
            <!-- [CategoryName] (SQL NVarChar(15)) field is mapped to CategoryName property (C# string). -->
            <Column Name="CategoryName" Type="System.String" DbType="NVarChar(15) NOT NULL" CanBeNull="false" />
            <!-- Other fields. -->
            <Column Name="Description" Type="System.String" DbType="NText" CanBeNull="true" UpdateCheck="Never" />
            <Column Name="Picture" Type="System.Data.Linq.Binary" DbType="Image" CanBeNull="true" UpdateCheck="Never" />
            <!-- [Categories] is associated with [Products] table via a foreign key.
            So Category class has a Products property to represent the associated many Product objects. -->
            <Association Name="Category_Product" Member="Products" ThisKey="CategoryID" OtherKey="CategoryID" Type="Product" />
        </Type>
    </Table>

    <!-- Products property is a member of NorthwindDataContext class. -->
    <Table Name="dbo.Products" Member="Products">
        <!-- [Products] table is mapped to Product class. -->
        <Type Name="Product">
            <!-- Fields. -->
            <Column Name="ProductID" Type="System.Int32" DbType="Int NOT NULL IDENTITY" IsPrimaryKey="true" IsDbGenerated="true" CanBeNull="false" />
            <Column Name="ProductName" Type="System.String" DbType="NVarChar(40) NOT NULL" CanBeNull="false" />
            <Column Name="SupplierID" Type="System.Int32" DbType="Int" CanBeNull="true" />
            <Column Name="CategoryID" Type="System.Int32" DbType="Int" CanBeNull="true" />
            <Column Name="QuantityPerUnit" Type="System.String" DbType="NVarChar(20)" CanBeNull="true" />
            <Column Name="UnitPrice" Type="System.Decimal" DbType="Money" CanBeNull="true" />
            <Column Name="UnitsInStock" Type="System.Int16" DbType="SmallInt" CanBeNull="true" />
            <Column Name="UnitsOnOrder" Type="System.Int16" DbType="SmallInt" CanBeNull="true" />
            <Column Name="ReorderLevel" Type="System.Int16" DbType="SmallInt" CanBeNull="true" />
            <Column Name="Discontinued" Type="System.Boolean" DbType="Bit NOT NULL" CanBeNull="false" />
            <!-- [Products] is associated with the [Categories] table via a foreign key.
            So Product class has a Category property to represent the associated Category object. -->
            <Association Name="Category_Product" Member="Category" ThisKey="CategoryID" OtherKey="CategoryID" Type="Category" IsForeignKey="true" />
        </Type>
    </Table>
</Database>

It describes how the SQL stuff is mapped to C# stuff.

A Northwind.dbml.layout file is created along with the dbml. It is also XML, describing how the O/R designer should visualize the object models:

<?xml version="1.0" encoding="utf-8"?>
<ordesignerObjectsDiagram dslVersion="1.0.0.0" absoluteBounds="0, 0, 11, 8.5" name="Northwind">
    <DataContextMoniker Name="/NorthwindDataContext" />
    <nestedChildShapes>
        <!-- Category class -->
        <classShape Id="81d67a31-cd80-4a91-84fa-5d4dfa2e8694" absoluteBounds="0.75, 1.5, 2, 1.5785953776041666">
            <DataClassMoniker Name="/NorthwindDataContext/Category" />
            <nestedChildShapes>
                <!-- Properties -->
                <elementListCompartment Id="a261c751-8ff7-471e-9545-cb385708d390" absoluteBounds="0.765, 1.96, 1.9700000000000002, 1.0185953776041665" name="DataPropertiesCompartment" titleTextColor="Black" itemTextColor="Black" />
            </nestedChildShapes>
        </classShape>

        <!-- Product class -->
        <classShape Id="59f11c67-f9d4-4da9-ad0d-2288402ec016" absoluteBounds="3.5, 1, 2, 2.7324039713541666">
            <DataClassMoniker Name="/NorthwindDataContext/Product" />
            <nestedChildShapes>
                <!-- Properties -->
                <elementListCompartment Id="6c1141a2-f9a9-4660-8730-bed7fa15bc27" absoluteBounds="3.515, 1.46, 1.9700000000000002, 2.1724039713541665" name="DataPropertiesCompartment" titleTextColor="Black" itemTextColor="Black" />
            </nestedChildShapes>
        </classShape>

        <!-- Association arrow -->
        <associationConnector edgePoints="[(2.75 : 2.28929768880208); (3.5 : 2.28929768880208)]" fixedFrom="Algorithm" fixedTo="Algorithm">
            <AssociationMoniker Name="/NorthwindDataContext/Category/Category_Product" />
            <nodes>
                <!-- From Category class -->
                <classShapeMoniker Id="81d67a31-cd80-4a91-84fa-5d4dfa2e8694" />
                <!-- To Product class -->
                <classShapeMoniker Id="59f11c67-f9d4-4da9-ad0d-2288402ec016" />
            </nodes>
        </associationConnector>
    </nestedChildShapes>
</ordesignerObjectsDiagram>

A Northwind.designer.cs is also created, containing the auto generated C# code.

This is what the NorthwindDataContext looks like:

[Database(Name = "Northwind")]
public partial class NorthwindDataContext : DataContext
{
    public Table<Category> Categories
    {
        get
        {
            return this.GetTable<Category>();
        }
    }

    public Table<Product> Products
    {
        get
        {
            return this.GetTable<Product>();
        }
    }
}

And this is the Category class:

[Table(Name = "dbo.Categories")]
public partial class Category : INotifyPropertyChanging, INotifyPropertyChanged
{
    private int _CategoryID;

    private EntitySet<Product> _Products;

    [Column(Storage = "_CategoryID", AutoSync = AutoSync.OnInsert, 
        DbType = "Int NOT NULL IDENTITY", IsPrimaryKey = true, IsDbGenerated = true)]
    public int CategoryID
    {
        get
        {
            return this._CategoryID;
        }
        set
        {
            if ((this._CategoryID != value))
            {
                this.OnCategoryIDChanging(value);
                this.SendPropertyChanging();
                this._CategoryID = value;
                this.SendPropertyChanged("CategoryID");
                this.OnCategoryIDChanged();
            }
        }
    }

    // Other properties.

    [Association(Name = "Category_Product", Storage = "_Products", 
        ThisKey = "CategoryID", OtherKey = "CategoryID")]
    public EntitySet<Product> Products
    {
        get
        {
            return this._Products;
        }
        set
        {
            this._Products.Assign(value);
        }
    }
}

The Product class looks similar.

Customize the mapping

Since the mapping info is simply stored in the XML file and the C# code, it can be customized in the O/R designer easily.

After renaming the Category class to CategoryEntity, the XML and C# are updated automatically:

<?xml version="1.0" encoding="utf-8"?>
<Database Name="Northwind" Class="NorthwindDataContext" xmlns="http://schemas.microsoft.com/linqtosql/dbml/2007">
    <Table Name="dbo.Categories" Member="CategoryEntities">
        <Type Name="CategoryEntity">
            <!-- Fields -->
        </Type>
    </Table>
    <Table Name="dbo.Products" Member="Products">
        <Type Name="Product">
            <!-- Fields -->
            <Association Name="Category_Product" Member="CategoryEntity" Storage="_Category" ThisKey="CategoryID" OtherKey="CategoryID" Type="CategoryEntity" IsForeignKey="true" />
        </Type>
    </Table>
</Database>

and

[Database(Name = "Northwind")]
public partial class NorthwindDataContext : DataContext
{
    public Table<CategoryEntity> CategoryEntities { get; }
}

[Table(Name = "dbo.Categories")]
public partial class CategoryEntity : INotifyPropertyChanging, INotifyPropertyChanged
{
}

[Table(Name = "dbo.Products")]
public partial class Product : INotifyPropertyChanging, INotifyPropertyChanged
{
    [Association(Name = "Category_Product", Storage = "_Category",
        ThisKey = "CategoryID", OtherKey = "CategoryID", IsForeignKey = true)]
    public CategoryEntity CategoryEntity { get; set; }
}

Properties, associations, and inheritance can also be customized.

For example, the ProductID property can be renamed to ProductId to be compliant with the .NET Framework Design Guidelines.

More options are available to customize the data context, entities, and properties.

Please notice that this mapping is a one-way mapping, from SQL Server to C#. When the mapping information is changed in the O/R designer, SQL Server is not affected at all.

And, LINQ to SQL is designed to provide a simple O/R mapping, without supporting advanced functionality like multi-table inheritance, etc. According to MSDN:

The single-table mapping strategy is the simplest representation of inheritance and provides good performance characteristics for many different categories of queries.

Please check this link for more details.

Work with the models

The auto-generated models are easy to use and extensible.

Partial class

All the generated C# classes are partial classes. For example, it is very easy to add a NorthwindDataContext.cs file and a Category.cs file to the project, and write the extension code there, as sketched below.
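
For instance, a minimal sketch (the ToString() override below is made up for illustration and is not part of the generated code) of extending the generated Category entity from a separate Category.cs file:

// Category.cs - extends the generated entity without touching the designer-generated file.
public partial class Category
{
    // Illustrative member only; it simply reuses the generated CategoryID and CategoryName properties.
    public override string ToString()
    {
        return string.Format("{0}: {1}", this.CategoryID, this.CategoryName);
    }
}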

Partial method

There are also a lot of partial methods in the generated code:

[Database(Name = "Northwind")]
public partial class NorthwindDataContext : DataContext
{
    #region Extensibility Method Definitions

    partial void OnCreated();
    partial void InsertCategory(Category instance);
    partial void UpdateCategory(Category instance);
    partial void DeleteCategory(Category instance);
    partial void InsertProduct(Product instance);
    partial void UpdateProduct(Product instance);
    partial void DeleteProduct(Product instance);

    #endregion
}

For example, OnCreated() can be implemented in NorthwindDataContext.cs:

public partial class NorthwindDataContext
{
    // OnCreated will be invoked by constructors.
    partial void OnCreated()
    {
        // The default value is 30 seconds.
        this.CommandTimeout = 40;
    }
}

When the NorthwindDataContext is constructed, OnCreated() is invoked, and the custom code is executed.

So are the entities:

[Table(Name = "dbo.Categories")]
public partial class Category : INotifyPropertyChanging, INotifyPropertyChanged
{
    #region Extensibility Method Definitions

    partial void OnLoaded();
    partial void OnValidate(ChangeAction action);
    partial void OnCreated();
    partial void OnCategoryIDChanging(int value);
    partial void OnCategoryIDChanged();
    partial void OnCategoryNameChanging(string value);
    partial void OnCategoryNameChanged();
    partial void OnDescriptionChanging(string value);
    partial void OnDescriptionChanged();
    partial void OnPictureChanging(Binary value);
    partial void OnPictureChanged();

    #endregion
}

For example, OnValidate() is very useful for data validation:

[Table(Name = "dbo.Categories")]
public partial class Category
{
    partial void OnValidate(ChangeAction action)
    {
        switch (action)
        {
            case ChangeAction.Delete:
                // Validates the object when deleted.
                break;
            case ChangeAction.Insert:
                // Validates the object when inserted.
                break;
            case ChangeAction.None:
                // Validates the object when not submitted.
                break;
            case ChangeAction.Update:
                // Validates the object when updated.
                if (string.IsNullOrWhiteSpace(this._CategoryName))
                {
                    throw new ValidationException("CategoryName is invalid.");
                }
                break;
            default:
                break;
        }
    }
}

When the category object (representing a record in Categories table) is updated, the custom code checking the CategoryName will be executed.

And, because each entity class’s Xxx property’s setter invokes the OnXxxChanging() partial method:

[Table(Name = "dbo.Categories")]
public partial class CategoryEntity : INotifyPropertyChanging, INotifyPropertyChanged
{
    [Column(Storage = "_CategoryName", DbType = "NVarChar(15) NOT NULL", CanBeNull = false)]
    public string CategoryName
    {
        get
        {
            return this._CategoryName;
        }
        set
        {
            if ((this._CategoryName != value))
            {
                this.OnCategoryNameChanging(value);
                this.SendPropertyChanging();
                this._CategoryName = value;
                this.SendPropertyChanged("CategoryName");
                this.OnCategoryNameChanged();
            }
        }
    }
}

Validation can also be done in this way:

public partial class CategoryEntity
{
    partial void OnCategoryNameChanging(string value)
    {
        if (string.IsNullOrWhiteSpace(value))
        {
            throw new ArgumentOutOfRangeException("value");
        }
    }
}

INotifyPropertyChanging and INotifyPropertyChanged interfaces

Each auto generated entity class implements INotifyPropertyChanging and INotifyPropertyChanged interfaces:

namespace System.ComponentModel
{
    public interface INotifyPropertyChanging
    {
        event PropertyChangingEventHandler PropertyChanging;
    }

    public interface INotifyPropertyChanged
    {
        event PropertyChangedEventHandler PropertyChanged;
    }
}

For example, in the above auto-generated CategoryName code, after setting the CategoryName, SendPropertyChanged() is invoked, passing the property name “CategoryName” as the argument:

[Table(Name = "dbo.Categories")]
public partial class CategoryEntity : INotifyPropertyChanging, INotifyPropertyChanged
{
    public event PropertyChangedEventHandler PropertyChanged;

    protected virtual void SendPropertyChanged(String propertyName)
    {
        if (this.PropertyChanged != null)
        {
            this.PropertyChanged(this, new PropertyChangedEventArgs(propertyName));
        }
    }
}

This is very useful to track changes of the entity object:

using (NorthwindDataContext database = new NorthwindDataContext())
{
    Category category = database.Categories.Single(item => item.CategoryName == "Beverages");
    category.PropertyChanged += (_, e) =>
        {
            Console.Write("Property {0} is changed", e.PropertyName);
        };

    // Work with the category object.
    category.CategoryID = 100;
    // ...
}

And this is used for change tracking by DataContext, which will be explained later.

Programmatically access the mapping information

The mapping information is stored in DataContext.Mapping as a MetaModel object. Here is an example:

public static class DataContextExtensions
{
    public static Type GetEntityType(this DataContext database, string tableName)
    {
        return database.Mapping.GetTables()
                               .Single(table => table.TableName.Equals(
                                   tableName, StringComparison.Ordinal))
                               .RowType
                               .Type;
    }
}

The method queries the mapping information with the table name, and returns the entity type:

using (NorthwindDataContext database = new NorthwindDataContext())
{
    Type categoryType = database.GetEntityType("dbo.Categories");
}

Create SQL schema from C# models

Usually, many people design the SQL database first, then model it with the O/R designer, and write code to work with the C# object models. But this is not required. It is totally OK to create POCO models first without considering the SQL stuff:

public partial class Category
{
    public int CategoryID { get; set; }

    public string CategoryName { get; set; }

    public EntitySet<Product> Products { get; set; }
}

Now it is already possible to start coding with this kind of model.

Later, there are two ways to integrate the C# program with a SQL Server database:

  • Generate object models from the designed SQL Server database;
  • Decorate the POCO models with mapping attributes, then invoke the CreateDatabase() method of DataContext to create the expected database schema in SQL Server.

For example, the C# models can be polluted with O/R mapping knowledge like this:

[Table(Name = "Categories")]
public class Category
{
    [Column(DbType = "Int NOT NULL IDENTITY", IsPrimaryKey = true)]
    public int CategoryId { get; set; }

    [Column(DbType = "NVarChar(15) NOT NULL")]
    public string CategoryName { get; set; }

    [Association(Name = "Category_Products",
        ThisKey = "CategoryId", OtherKey = "CategoryId")]
    public EntitySet<Product> Products { get; set; }
}

[Table(Name = "Products")]
public class Product
{
    [Column(DbType = "Int NOT NULL IDENTITY", IsPrimaryKey = true)]
    public int ProductId { get; set; }

    [Column(DbType = "NVarChar(40) NOT NULL")]
    public string ProductName { get; set; }

    [Column(DbType = "Int")]
    public int CategoryId { get; set; }

    [Association(Name = "Category_Products", IsForeignKey = true,
        ThisKey = "CategoryId", OtherKey = "CategoryId")]
    public Category Category { get; set; }
}

[Database(Name = "SimpleNorthwind")]
public class SimpleNorthwindDataContext : DataContext
{
    public SimpleNorthwindDataContext(IDbConnection connection)
        : base(connection)
    {
    }

    public Table<Category> Categories { get; set; }

    public Table<Product> Products { get; set; }
}

Now it is ready to create the database schema in SQL Server:

using (SimpleNorthwindDataContext database = new SimpleNorthwindDataContext(new SqlConnection(
    @"Data Source=qablog.qaitdevlabs.com;Initial Catalog=SimpleNorthwind;Integrated Security=True")))
{
    if (database.DatabaseExists())
    {
        database.DeleteDatabase();
    }

    database.CreateDatabase();
}

Isn’t this easy? This is the generated SimpleNorthwind database in SQL Server.
