Node.js tools

nodemon:

By default, the node process needs to be restarted to pick up any change in your JS files. Thus, every time there is a change, you need to stop the node process and start it again.

You can automate this task with the nodemon npm package, which monitors your files for changes and restarts your application automatically. Install it globally:

npm install nodemon -g

Once the installation is complete, you can start your node process with this command:

nodemon app.js

That’s it!

Now, as soon as you make a change in app.js, nodemon will restart the process for you.

Node-inspector

Another important tool that you will want during development is node-inspector. This tool allows you to debug your node applications.

Install it with this command:

npm install node-inspector -g

Then start node-inspector, and run your application with the debug flag from a separate console:

node-inspector
node --debug app.js

node-inspector will print the URL to open in your browser for debugging.

A shortcut that starts both the inspector and your process in debug mode is:

node-debug app.js

Using Node-Inspector and Nodemon together:

You can use both packages together with this command:

node-inspector & nodemon --debug app.js

 

Extension Methods in C#

Say you want to extend a class that you cannot inherit from. The class might be defined as sealed, or it might live in a third-party DLL that you have just downloaded from NuGet. How would you extend such classes?

Let us take the example of struct types in C#, which are implicitly sealed, such as DateTime and Int32, or sealed classes such as String.

You know that you cannot extend DateTime like this:

public struct CustomDateTime : DateTime { }

Error at compile time: Type 'DateTime' in interface list is not an interface

One option is to wrap a DateTime variable within a CustomDateTime class and then provide your custom members. For example:

public class CustomDateTime
{
    private DateTime _dateTime;

    public CustomDateTime()
    {
        _dateTime = new DateTime();
    }

    public CustomDateTime(long ticks)
    {
        _dateTime = new DateTime(ticks);
    }

    public CustomDateTime(long ticks, DateTimeKind kind)
    {
        _dateTime = new DateTime(ticks, kind);
    }

    public CustomDateTime(int year, int month, int day)
    {
        _dateTime = new DateTime(year, month, day);
    }

    public CustomDateTime(int year, int month, int day, Calendar calendar)
    {
        _dateTime = new DateTime(year, month, day, calendar);
    }

    public CustomDateTime(int year, int month, int day, int hour, int minute, int second)
    {
        _dateTime = new DateTime(year, month, day, hour, minute, second);
    }

    public CustomDateTime(int year, int month, int day, int hour, int minute, int second, Calendar calendar)
    {
        _dateTime = new DateTime(year, month, day, hour, minute, second, calendar);
    }

    public CustomDateTime(int year, int month, int day, int hour, int minute, int second, DateTimeKind kind)
    {
        _dateTime = new DateTime(year, month, day, hour, minute, second, kind);
    }

    public CustomDateTime(int year, int month, int day, int hour, int minute, int second, int millisecond)
    {
        _dateTime = new DateTime(year, month, day, hour, minute, second, millisecond);
    }

    // ...plus the remaining DateTime constructor overloads that combine
    // milliseconds with Calendar and/or DateTimeKind.
}
 

As you can see, you are actually reinventing the wheel here.

A second option is to define a static utility class such as:

public static class DateTimeUtility
{
    public static string CustomFormat(DateTime date)
    {
        return "Your Date is: " + date.ToLongDateString();
    }
}

public class Program
{
    public static void Main()
    {
        DateTimeUtility.CustomFormat(new DateTime());
    }
}

 

However, it is not user friendly because:

  • You have to pass the variable in explicitly
  • It is not easy to read
  • You have to write more code, e.g. DateTimeUtility.CustomFormat(date) rather than date.CustomFormat()

 

Extension methods are not a necessity, but they are an elegant way of writing code.

With the help of an extension method, CustomFormat effectively becomes an instance method of DateTime. Note the this modifier on the first parameter; that is all it takes.

The above code now looks like this:

 

public static class DateTimeUtility
{
    public static string CustomFormat(this DateTime date)
    {
        return "Your Date is: " + date.ToLongDateString();
    }
}

public class Program
{
    public static void Main()
    {
        var dateTime = new DateTime();
        dateTime.CustomFormat();
    }
}
You can also write an extension method once against a generic type parameter and reuse it for every type that satisfies its constraints. Another example is implementing a range ("between") check:

public static class ComparableExtensions
{
    public static bool WithRange<T>(this T actual, T lower, T upper) where T : IComparable<T>
    {
        // Lower bound inclusive, upper bound exclusive.
        return actual.CompareTo(lower) >= 0 && actual.CompareTo(upper) < 0;
    }
}

var number = 5;
if (number.WithRange(3, 7))
{
    // ....
}
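
Because the only constraint is IComparable<T>, the same extension works unchanged for other types. A quick illustration with DateTime (the dates here are made up):

var today = DateTime.Today;

// True when today falls within the first quarter of 2015.
bool inFirstQuarter = today.WithRange(new DateTime(2015, 1, 1), new DateTime(2015, 4, 1));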

 

How to make a gulp-sass task work on Windows?

Step-by-step guide

  1. Go to: http://rubyinstaller.org/
  2. Click download and install Ruby 1.9.3 (http://dl.bintray.com/oneclick/rubyinstaller/rubyinstaller-1.9.3-p545.exe?direct).
  3. By default it will install on your c:\… drive. Don't change any default location, and please make sure that there is no space in the path.
  4. Set your PATH variable to include your Ruby installation location. Now you should be able to run gem env in your command prompt (you should probably run the command prompt as administrator).


  5. From the command prompt, run gem install sass

     

     If you get an error complaining about a network drive not being found (or something like that), then probably your HOME path is not set to your local drive. You need to set HOME so that it points to your local drive. To check the value, run SET HOME. If it is not set correctly, or points to some network drive, then set it with SET HOME=%UserProfile%
    
    You can read more about this error at http://help.rubygems.org/discussions/problems/333-install-gems-on-windows-7

     

     

  6. Now you need to install Bundler: run gem install bundler

    If you get this error during gem installation:
    ERROR: While executing gem ... (Gem::RemoteFetcher::FetchError)
    SocketError: getaddrinfo: No such host is known. (http://rubygems.org/latest_specs.4.8.gz)

    then you probably need to look at your proxy settings. If you don't want to enforce any proxy, run this command, which will empty any proxy settings: gem install sass -p

    If you want gem to use a specific proxy, run this: gem install sass -p http://myproxy.com
After following these steps, you should be able to run your Gulp task for Sass from a Windows machine.

Custom JSON Converter for Ember Data Relations with WebApi2

Currently, I am writing a Web API for an Ember front-end. Ember strongly expects the relationships of any object to be sent as their IDs only.

class Address
{
    public int ID { get; set; }
    public string Location { get; set; }
    public string Postcode { get; set; }
}

class Employee
{
    public string Name { get; set; }
    public ICollection<Address> Addresses { get; set; }
}

 

The default behavior of the Json.NET serializer is to send the Employee record with a collection of Address objects that include all fields, i.e. ID, Location, Postcode:

{
  "Name": "Amit",
  "Addresses":
  [
    { "ID": 1, "Location": "XX", "Postcode": "XX" },
    { "ID": 2, "Location": "XX", "Postcode": "XX" }
  ]
}

I want to customize Json.NET so that when I serialize a model from my API, it sends only an array of IDs for a nested collection object.

i.e. the above serialization should look like:

{
  "Name": "Amit",
  "Addresses": [1, 2]
}

You can get the result you want using a custom JsonConverter such as this:

/// <summary>
/// A custom JSON converter for collection types that serializes the collection
/// to an array of IDs for Ember.
/// </summary>
public class IDWriteListConverter : JsonConverter
{
    /// <summary>
    /// The name of the ID property expected on the collection's element type.
    /// </summary>
    private readonly string keyname = "ID";

    /// <summary>
    /// It is a write-only converter; it cannot read JSON back.
    /// </summary>
    public override bool CanRead
    {
        get { return false; }
    }

    /// <summary>
    /// Validate that this conversion can be applied to IEnumerable types only.
    /// </summary>
    /// <param name="objectType">Type of the object</param>
    /// <returns>Whether the type can be converted</returns>
    public override bool CanConvert(Type objectType)
    {
        return typeof(IEnumerable).IsAssignableFrom(objectType)
            && objectType != typeof(string);
    }

    public override object ReadJson(JsonReader reader, Type objectType, object existingValue, JsonSerializer serializer)
    {
        throw new NotImplementedException();
    }

    /// <summary>
    /// Writes the IEnumerable value as a JSON array of ID values.
    /// </summary>
    /// <param name="writer">JSON writer</param>
    /// <param name="value">Value of the object</param>
    /// <param name="serializer">JSON serializer object</param>
    public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer)
    {
        JArray array = new JArray();
        foreach (object item in (IEnumerable)value)
        {
            PropertyInfo idProp = item.GetType().GetProperty(this.GetKeyName());
            if (idProp != null && idProp.CanRead)
            {
                array.Add(JToken.FromObject(idProp.GetValue(item, null)));
            }
        }

        array.WriteTo(writer);
    }

    public virtual string GetKeyName()
    {
        return keyname;
    }
}

In your model, wherever you have a collection of something where you only want the IDs, decorate the collection property with a [JsonConverter] attribute specifying the custom converter. For example:

class Employee
{
    public string Name { get; set; }

    [JsonConverter(typeof(IDWriteListConverter))]
    public ICollection<Address> Addresses { get; set; }
}

When the collection gets serialized, the converter will be used, and only the ID values will be written out. Demo:

class Program
{
    static void Main(string[] args)
    {
        Employee emp = new Employee
        {
            Name = "Joe",
            Addresses = new List<Address>
            {
                new Address { ID = 1, Location = "foo", Postcode = "bar" },
                new Address { ID = 2, Location = "baz", Postcode = "quux" }
            }
        };

        string json = JsonConvert.SerializeObject(emp);
        Console.WriteLine(json);
    }
}

Output:

{"Name":"Joe","Addresses":[1,2]}

RESTful WebApi Implementation Guidelines

 

Introduction

As we go into an era where all devices need to be connected to a server, there is a need to build APIs that are easy to use, so that all devices speak the same language.

What is REST?

Representational State Transfer. It relies on a stateless, client-server, cacheable communications protocol; in virtually all cases, the HTTP protocol is used.

What is Hypermedia?

In layman's terms: when a REST response comes back, it should include some Web API resources as references for the client. This way the client learns about valid URIs that it can call next.

  • Method of self-describing messages
  • Hypermedia as The Engine of Application State == HATEOAS
  • Hypermedia can create dynamic, self-documenting APIs
  • But an API can still be great without HATEOAS


Web Services vs REST

The concept of an API has been in the development industry since COM; what has changed now is how easily the API is available to different devices.

COM – Limitation: COM must be packaged along with the program, and it is accessible only to specific programming languages; a C++ DLL is not directly accessible from Java.
DCOM – Limitation: a proxy needs to be installed on the local machine. Again, compatibility issues and permission headaches.
ASMX – A web service accessible through the HTTP protocol; the client only needs to know how to call it through a proxy that uses SOAP as the message protocol.
SVC – The client still needs the knowledge to interpret SOAP.

Why is the web service era drying up? Web service calls require a standard to be followed, such as SOAP. Not all devices know about these standards, which adds the complexity of providing a client-side library that can speak the SOAP message format before the web service can be called. A REST Web API uses plain HTTP, so it has less overhead compared to the previous technologies. What this means for your code is that your functions are available as-is from mobile to desktop: all you need is HTTP, which is available even in $20 mobile phones.

Should you use REST?
  • REST or REST-ful APIs are easy to consume and maintain
  • If your API does not fit the Resource Model, RPC or custom APIs are fine
  • Your new development should start from REST principles; if they do not fit, then proceed with the older procedures.

What will you learn here?

The idea behind this article is to give you guidelines on what should be done to make a good Web API. This document lists some points that I have learnt and implemented (some :-)) for Web API architecture. You can apply these rules in any programming language. Let us go through the rules.

Rules

1.» Don’t use a verb as part of the URL. So rather than /todos/update, it should be URL: /todos, METHOD: POST or PUT.

The main reason for not using verbs is that as the project grows there are too many URIs, and verb-based URIs are hard to maintain; it becomes difficult to manage your API.

1.1» Use identifiers to locate individual items in URIs. The identifier does not have to be an internal key.

e.g. (internal key)
http://api.yourcompany.com/customers/23

or (a non-internal key that still uniquely identifies that single entity)

http://api.yourcompany.com/customers/283473-asdf923-asdf2

1.2» Use either plural or singular nouns consistently. Don’t mix them.

Preferred (plural): /polls/23, /todos/2384
Preferred (singular): /poll/23, /todo/2384
Not preferred (mixed): /polls/23, /todo/2384

2.» Concrete is better than abstract; instead of /items/23, the URL should name the type of item, like /cars/23.

3.» Don’t use a query string (?) in the URL where possible; express associations as resources/id/resources through the URL, as in the examples and the sketch below.

http://...api/customers/1/orders
http://...api/customers/1/payments
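
As a rough sketch of how such association URLs can be wired up with ASP.NET Web API 2 attribute routing (the controller, the OrdersFor method, and _repository are hypothetical, and config.MapHttpAttributeRoutes() is assumed to be registered):

[RoutePrefix("api/customers")]
public class CustomersController : ApiController
{
    // GET api/customers/1/orders
    [Route("{customerId:int}/orders")]
    public IHttpActionResult GetOrders(int customerId)
    {
        // _repository stands in for your data access layer.
        return Ok(_repository.OrdersFor(customerId));
    }
}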

4.» Use predefined HTTP status codes in the returned headers of your API methods; see the available codes at http://en.wikipedia.org/wiki/Http_error_codes

5.» You should also give clients an option to suppress the codes via an optional parameter, suppress_response_codes, which forces the API to always return 200: /customers/1?suppress_response_codes=true

Response should be:

HTTP status code: 200 {"error":"Could not authenticate you."}
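
One way to implement this in ASP.NET Web API is a message handler that rewrites the status code on the way out. This is a minimal sketch; the handler name is my own, and it would be registered with config.MessageHandlers.Add(new SuppressResponseCodesHandler()):

public class SuppressResponseCodesHandler : DelegatingHandler
{
    protected override async Task<HttpResponseMessage> SendAsync(
        HttpRequestMessage request, CancellationToken cancellationToken)
    {
        HttpResponseMessage response = await base.SendAsync(request, cancellationToken);

        // Look for suppress_response_codes=true in the query string.
        bool suppress = request.GetQueryNameValuePairs().Any(kv =>
            string.Equals(kv.Key, "suppress_response_codes", StringComparison.OrdinalIgnoreCase) &&
            string.Equals(kv.Value, "true", StringComparison.OrdinalIgnoreCase));

        if (suppress)
        {
            // Keep the body, which should describe the real outcome,
            // but always report HTTP 200 to the client.
            response.StatusCode = HttpStatusCode.OK;
        }

        return response;
    }
}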

6.» Make the version mandatory. Specify the version with a ‘v’ prefix, but keep in mind that the API version is not the product version. There are four ways to implement versioning:

  • Version as part of the URL – The main drawback is maintaining clients when the URL changes. If you define a URL-based version, move it all the way to the left in the URL so that it has the highest scope (e.g. /v1/customers).
  • Version as part of the query string – Same drawback as above.
  • Version as a content type in the Accept header – The main drawback of this approach is the complexity of implementing a custom header on every platform. It can also encourage increased versioning, which causes more code churn. e.g. Accept: vnd.api.v1.customer
  • Version as a custom header – Same drawbacks as the content type. e.g. x-api-version: 2 or x-api-version: 10-07-2014

Whichever way you go, use a simple ordinal number; don’t use dot notation like v1.2. Before obsoleting previous versions, give developers at least one release cycle to react.
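
For the URL-based option, a hedged sketch with Web API 2 attribute routing (splitting controllers per version is just one convention):

[RoutePrefix("v1/customers")]
public class CustomersV1Controller : ApiController
{
    // GET v1/customers/23
    [Route("{id:int}")]
    public IHttpActionResult Get(int id) { /* version 1 behaviour */ return Ok(); }
}

[RoutePrefix("v2/customers")]
public class CustomersV2Controller : ApiController
{
    // GET v2/customers/23
    [Route("{id:int}")]
    public IHttpActionResult Get(int id) { /* version 2 behaviour */ return Ok(); }
}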

7.» Implement partial response; partial response allows you to give developers just the information they need (a sketch follows the list below).

  • Implement Partial GET
  • Implement Partial Patch
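
Here is a minimal sketch of a partial GET, assuming a ?fields=name,email style parameter; the Customer type, _repository, and the reflection-based projection are illustrative only:

public IHttpActionResult GetCustomer(int id, string fields = null)
{
    var customer = _repository.Find(id);
    if (customer == null) return NotFound();
    if (string.IsNullOrEmpty(fields)) return Ok(customer);

    // Project only the requested properties into a dictionary.
    var partial = new Dictionary<string, object>();
    foreach (var name in fields.Split(','))
    {
        var prop = typeof(Customer).GetProperty(name.Trim(),
            BindingFlags.Public | BindingFlags.Instance | BindingFlags.IgnoreCase);
        if (prop != null) partial[prop.Name] = prop.GetValue(customer, null);
    }

    return Ok(partial);
}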

8.» Make it easy for developers to paginate objects in a database; use limit and offset parameters, as in the sketch below.
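
A hedged sketch of limit/offset paging in a Web API controller (Customer, Id, and _repository are placeholders):

public class CustomersController : ApiController
{
    // GET api/customers?limit=25&offset=50
    public IHttpActionResult GetCustomers(int limit = 25, int offset = 0)
    {
        // Cap the page size so a client cannot pull the whole table at once.
        limit = Math.Min(limit, 100);

        var page = _repository.Customers
            .OrderBy(c => c.Id)   // paging needs a stable order
            .Skip(offset)
            .Take(limit)
            .ToList();

        return Ok(page);
    }
}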

9.» Accepting different formats via content negotiation is a best practice. Use the Accept header to determine which formats the client allows:

GET api/customers/1 HTTP/1.1
Accept: application/json, text/xml
Host: localhost...

10.» Use JSON as the default return type, following the JavaScript camelCasing convention for naming attributes (see Tips #1 below for a Web API implementation).

e.g. {"firstName":"Joe","lastName":"Public"}

11.» The standard place to host is api.xxx.com; if a request comes from a browser, redirect it to developers.xxx.com. Also redirect any request for dev or developer subdomains to developers.xxx.com.

12.» On a successful update, return 200 (or 204 if not returning any content in the body) from a PUT. If using PUT for create, return HTTP status 201 on successful creation. A sketch follows.
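
A minimal sketch of these status codes (Todo and _store are hypothetical placeholders; the status-code pattern is the point):

public class TodosController : ApiController
{
    // PUT api/todos/5
    public IHttpActionResult PutTodo(int id, Todo todo)
    {
        bool created = !_store.Exists(id);
        _store.Upsert(id, todo);

        if (created)
        {
            // PUT used for creation: 201 plus the location of the new resource.
            return Created(new Uri(Request.RequestUri, id.ToString()), todo);
        }

        // Successful update without returning a body: 204 No Content.
        return StatusCode(HttpStatusCode.NoContent);
    }
}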

13.» POST is neither safe nor idempotent; it is therefore the recommended method for non-idempotent resource requests.

14.» PUT vs POST for creation: in short, favor POST. Use PUT only when the client is in charge of deciding which URI the new resource will have (via its resource name or ID): if the client knows what the resulting URI will be, PUT to that URI. Use POST when the server or service is in charge of deciding the URI for the newly created resource, i.e. when the client doesn’t (or shouldn’t) know what the resulting URI will be before creation.

15.» Choose CORS whenever and wherever possible.

16.» Non-Resource API

  • Be sure that the user can tell it is a different type of operation
  • Be pragmatic and make sure that these parts of your API are documented
  • Don’t use them as an excuse to build an RPC API

e.g.
http://...api/calculateTax?state=GA&total=149.99
http://...api/restartServer

17.» It is recommended to return an Entity Tag (ETag) header for each GET (read) operation. Both weak tags (starting with W/) and strong tags should be supported. The If-None-Match request header is used by the client to send back the ETag value it holds:

GET /api/customers/1 HTTP/1.1
Accept: application/json, text/xml
Host: localhost
If-None-Match: "823748374783"

If the resource has not changed, the response should return a 304 (Not Modified) status code.
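
A hedged sketch of honouring If-None-Match in a Web API action; how the tag is computed (here from a hypothetical Version property) is up to you:

public IHttpActionResult GetCustomer(int id)
{
    var customer = _repository.Find(id); // _repository is a placeholder
    if (customer == null) return NotFound();

    var etag = new EntityTagHeaderValue("\"" + customer.Version + "\"");

    // If the client already holds this version, skip the body.
    if (Request.Headers.IfNoneMatch.Any(t => t.Tag == etag.Tag))
    {
        return StatusCode(HttpStatusCode.NotModified);
    }

    var response = Request.CreateResponse(HttpStatusCode.OK, customer);
    response.Headers.ETag = etag;
    return ResponseMessage(response);
}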

18.» Protect your API

  • SSL is almost always appropriate.
  • Secure the API itself using cross-origin security:
    1. JSON with Padding (JSONP) – not recommended, because the callback has to be maintained.
    2. Cross-Origin Resource Sharing (CORS) – the preferred option (see rule 15).

19.» Authorization & Authentication

  1. Authorization – use API keys.
  2. Authentication – options:
    • Website security such as Forms or Windows Auth for 1st-party developers (internal auth)
    • OAuth for 3rd-party developers (external auth)

In the next series of posts, I will explain and implement these rules using ASP.NET Web API. Stay tuned…

Tips #2: Problem connecting with NuGet

Problem:

For some reason I could not connect to NuGet from my Visual Studio. I was getting an error from the Package Manager Console in the IDE.

Identification:

It came to light that there was a proxy issue, because in my organisation the proxy is configured in the IE settings.

Hence I had to apply the same settings to the devenv configuration too.

Solution:

1. Open the devenv.exe.config of your Visual Studio installation; in my case it is Visual Studio 2013 and the file is located at: C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE

2. Now open this file (you should run your editor as admin) and add a <defaultProxy> node inside the <system.net> node:

<system.net>
  <defaultProxy useDefaultCredentials="true" enabled="true">
    <proxy bypassonlocal="true" proxyaddress="http://yourproxyaddres..." />
  </defaultProxy>
</system.net>

In the proxyaddress attribute you can specify the address and port of your proxy server.

 

Tips #1: camelCase JSON from Web API

Problem:

I came across a situation where I needed to build a Web API that returns values in camelCase, per the JavaScript naming convention.

 

Solution:

There are two options. The first is to define JSON properties for each POCO (Plain Old CLR Object):

[JsonObject("todoItem")]
public class TodoItem
{
    [JsonProperty("id")]
    public int ID { get; set; }

    [JsonProperty("title")]
    public string Title { get; set; }

    [JsonProperty("isCompleted")]
    public bool Completed { get; set; }
}

This way the JSON serializer knows how to represent the JSON result.

The second solution is to use the Newtonsoft.Json.Serialization.CamelCasePropertyNamesContractResolver class.

Use the snippet below to configure the serialization format.

config.Formatters.JsonFormatter.SerializerSettings.ContractResolver = new CamelCasePropertyNamesContractResolver();

I wrote this line in the WebApiConfig.cs file of the Web API project:

public static void Register(HttpConfiguration config)
{
    // Configure serializer settings so that the response comes out in camelCase.
    config.Formatters.JsonFormatter.SerializerSettings.ContractResolver =
        new CamelCasePropertyNamesContractResolver();
}

An introduction to Gulp Build System

What is Gulp?

Just like Grunt, a build tool for front-end web development, Gulp runs predefined tasks on static files such as CSS, HTML, JavaScript, and images. Some of these tasks include:

  • minimising file sizes
  • watching file changes and automating the deployment

What is the major difference between Grunt and Gulp?

  • Grunt plug-ins often perform multiple tasks, whereas Gulp plug-ins are designed to do one thing only; it is clean and neat.
  • Grunt requires plug-ins for basic functionality such as file watching; Gulp has them built-in.
  • Grunt uses JSON-like data configuration files; Gulp uses leaner, simpler JavaScript code. You don’t create multiple configuration files in Gulp for each task like in Grunt.

How to use Gulp?

I have recorded a video that explains:

  1. How to Install Gulp
  2. How to Install Gulp Plugins
  3. How to find Plugins
  4. How to define Gulp tasks
  5. How to execute a bunch of tasks
  6. How to watch changes in static resources and execute tasks automatically.


References

  • http://www.smashingmagazine.com/2014/06/11/building-with-gulp/

 

TFS || !TFS

Introduction

Version control is probably the single most important piece of the development environment after the IDE itself, essential for teams and I’d argue extremely valuable even for a solo developer. So, if you’re about to start a new project it’s a good time to consider what version control solution you’re going to use.

Before we go into our friendly debate, let us think about the list of items that are important to the business.

What does the business need?

CHECK LIST – Business requirements for source control

  • The source code repository is backed up with minimal effort
  • The source code repository is available in-house
  • Only authorised people can view and access source control
  • Code is easy to recover at any point in time
  • If one person is not available, others’ productivity is not impacted
  • Branching and merging are supported
  • (Probably most important) what developers want…

What does the developer need?

CHECK LIST

  • Branching and merging should be as easy as possible
  • Integration with other developer tools such as Visual Studio, Eclipse etc.
  • A way to shelve (tag) code into the repository for backup or for sharing code
  • Multiple users can work on the same file
  • Conflicts are easy to manage, and rollbacks are easy
  • Compare any changes and recover code at any time

Knowing TFS and Git

Well, I like Team Foundation Server and I like Git, but which is better?

TFS

Team Foundation Version Control (TFVC) uses a single, centralized server repository to track and version files. Local changes are always checked in to the central server where other developers can get the latest changes.

Git is a Distributed Version Control System (DVCS) that uses a local repository to track and version files. Changes are shared with other developers by pushing and pulling changes through a remote, shared repository.

NOTE: With TFS 2012 or above, both the local and the centralized model are supported.

It is quite clear that you can go both ways, and Git has no inherent advantage over TFS except its open-source community support.
You can get Visual Studio Online free of cost for up to 5 developers:
http://www.visualstudio.com/en-us/products/visual-studio-online-overview-vs.aspx

Even though Microsoft TFS has started supporting the DVCS model, the support is very limited; you can only use it for some purposes, such as modifying files offline.

TFS with TFVC, with either model (Edit Commit with local workspaces, Check-in Check-out with server workspaces), is still centralised, so a developer working locally without a connection to the server has some limitations such as not being able to create branches, merge changes, view history or perform compares without the server connection. In other words, TFVC is not a DVCS, and if that’s what you need then you’ll probably be looking at the most popular DVCS tool, Git.

Git

Git arose from the Linux kernel development community (including and in particular Linus Torvalds) in 2005 and has become the leading DVCS tool. It has a totally different model to centralised version control systems like TFVC by keeping a full repository locally on the developer’s machine. The repository is essentially a file based system with metadata in hidden files. The developer can then perform any task offline, including checking out, editing, committing, viewing and diffing on the full history, creating and destroying branches, merging and more.

At appropriate points the developer’s local repository can be synchronised with a remote repository either by fetching, pulling or pushing changes. This, of course, does require a connection to the remote repository.

Git is popular with Open Source teams, who are typically highly distributed, and suits modular codebases where individuals can work entirely locally for extended periods.

Most of the Windows Git tools are command line based, but Visual Studio also has Git integration, either via the Visual Studio Gallery or in 2013 it’s out of the box.

Having added this solution to a Git repository there are commands available within Visual Studio to work with Git, in this case right-clicking on the solution to commit changes locally to the Git repository.

TFS with Git

TFS 2013 now includes Git repository support out of the box. When you create a new Team Project there is a new page in the wizard asking you to specify whether you want TFVC or Git for the version control aspect of the Team Project.

 


I won’t go into a step-by-step how-to for using Git in TFS, as this is covered very well elsewhere, but the Git features and the TFS features are now integrated. For example, when I commit local changes to Git in TFS, I can also associate work items (such as a user story or task).

 

This means that I can continue to use the capabilities of TFS (Agile tools, work items, build and so on), whilst selecting Git as the version control solution within TFS, and having the two areas integrated.

 

Is this a special “Microsoft” version of Git? No, it’s absolutely standard, meaning that you can use the tools with Visual Studio and/or your own preferred command line or other Git tools, as you wish.

 

Summary

 

I hope this clarifies the new options around version control within TFS, and that where you might have been debating DVCS vs centralised, or Git vs TFS, new possibilities have been opened up.

If the team decides to go for Git, then it is probably good to go with the TFS and Git combination; then we probably don’t need any other agile modelling tools.

In fact, we can probably use the Team Rooms feature of Visual Studio 2013.

Foxit PDF adapted by Google Chrome

Google has announced that it is using the Foxit PDF libraries, through which you can manipulate PDF operations at the browser level.

Indeed, this will further push us to develop software that is specific to the Google platform. However, it will be interesting to see how others, like IE and Firefox, cope with the challenge of extending support for such features in their browsers as well.

I have been playing around with Foxit. Overall it is a nice and clean SDK that works on different platforms. It can also work with encrypted forms.

For .NET developers there are samples available directly from this link:

http://cdn01.foxitsoftware.com/pub/foxit/sdk/net/1.x/1.0/FoxitPDFSDKForNET102_enu.zip

The main problem that I find with this type of SDK is that you have to create even a table through code. If you want to attach charts or graphs to your PDF, things can turn into far too much coding. That is the main reason I am not convinced to go for this product, or to develop something that is specific to one browser.

Now what could be the solution?

The solution that I recommend to my clients is to go with RDLC, which is free (technically not), is a Microsoft product, and is easy to design with.

Here is sample code that should solve the problem of generating a PDF from RDLC:

Warning[] warnings;
string[] streamIds;
string mimeType = string.Empty;
string encoding = string.Empty;
string extension = string.Empty;

// This is optional; if you have parameters then you can add as many as you want.
ReportParameter[] param = new ReportParameter[5];
param[0] = new ReportParameter("Report_Parameter_0", "1st Para", true);
param[1] = new ReportParameter("Report_Parameter_1", "2nd Para", true);
param[2] = new ReportParameter("Report_Parameter_2", "3rd Para", true);
param[3] = new ReportParameter("Report_Parameter_3", "4th Para", true);
param[4] = new ReportParameter("Report_Parameter_4", "5th Para");

DataSet dsActPlan = new DataSet(); // Fill this DataSet with your data.
ReportDataSource rdsAct = new ReportDataSource("RptActDataSet_usp_GroupAccntDetails", dsActPlan.Tables[0]);

ReportViewer viewer = new ReportViewer();
viewer.LocalReport.ReportPath = "Reports/AcctPlan.rdlc"; // This is your RDLC name.
viewer.LocalReport.SetParameters(param);
viewer.LocalReport.DataSources.Add(rdsAct); // Add the data source here.
viewer.LocalReport.Refresh();

// NOTE: You can also pass "Excel" as the first parameter to render a spreadsheet.
byte[] bytes = viewer.LocalReport.Render("PDF", null, out mimeType, out encoding,
    out extension, out streamIds, out warnings);

Response.Buffer = true;
Response.Clear();
Response.ContentType = mimeType;
Response.AddHeader("content-disposition", "attachment; filename=report." + extension); // pick your own file name
Response.OutputStream.Write(bytes, 0, bytes.Length); // write the file
Response.Flush();                                    // send it to the client to download
Response.End();

In conclusion, I would only go with generating PDFs through the SDK if I did not have access to RDLC. If you want to manipulate generated PDFs, then you might want to look at a PDF SDK as well. Otherwise, for simple operations such as generating a PDF, my recommendation is RDLC.