Tuesday, January 16, 2018

ASP.NET Core 2.0 - Logging with Log4Net (Microsoft.Extensions.Logging.Log4Net.AspNetCore)


Below are the steps to integrate log4net into an ASP.NET Core application.

1) First, install the required package:
    dotnet add package Microsoft.Extensions.Logging.Log4Net.AspNetCore
2) Create a file named log4net.config with the following XML content:

    <?xml version="1.0" encoding="utf-8" ?>
    <log4net>
      <appender name="RollingFileAppender" type="log4net.Appender.RollingFileAppender">
        <file type="log4net.Util.PatternString" value="Logs/Logs_%date{yyyyMMdd}.txt" />
        <appendToFile value="true" />
        <rollingStyle value="Composite" />
        <datePattern value="yyyyMMdd" />
        <maxSizeRollBackups value="10" />
        <maximumFileSize value="10000KB" />
        <staticLogFileName value="true" />
        <layout type="log4net.Layout.PatternLayout">
          <conversionPattern value="%-5level %date{dd-MM-yyyy HH:mm:ss} %logger [%thread] - %message %exception%newline" />
        </layout>
      </appender>
      <root>
        <appender-ref ref="RollingFileAppender" />
        <level value="All" />
      </root>
    </log4net>
3) Inside appsettings.json, add a section to store the log4net config path; the section sits at the same level as "Logging":

  "Log4NetConfigFile": {
    "Name": "Services/LoggingService/log4net.config"
  }

    My log4net.config is inside Services/LoggingService; you can place it in the project root instead, in which case the section needs a small path change:

    "Log4NetConfigFile": {
        "Name": "log4net.config"
    }

4) Inside the Configure method in Startup.cs, add:

        loggerFactory.AddConsole(Configuration.GetSection("Logging"));
        loggerFactory.AddDebug();
        loggerFactory.AddLog4Net(Configuration.GetValue<string>("Log4NetConfigFile:Name"));
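Pieced together, the Configure method looks roughly like this. This is only a sketch assuming the default ASP.NET Core 2.0 template signature; your routing and environment setup will differ:

```csharp
public void Configure(IApplicationBuilder app, IHostingEnvironment env, ILoggerFactory loggerFactory)
{
    // Console and debug providers from the default template, plus log4net.
    loggerFactory.AddConsole(Configuration.GetSection("Logging"));
    loggerFactory.AddDebug();
    loggerFactory.AddLog4Net(Configuration.GetValue<string>("Log4NetConfigFile:Name"));

    app.UseMvc();
}
```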

5) Inside your test controller, add a class-level field
    private readonly ILogger Logger;

   and initialize it through constructor injection:

    public TestController(ILoggerFactory DepLoggerFactory, IHostingEnvironment DepHostingEnvironment)
    {
        HostingEnvironment = DepHostingEnvironment;
        Logger = DepLoggerFactory.CreateLogger("Controllers.TestController");
    }

6) Now create a test action:


    public IActionResult GetAllMyModels(string id)
    {
        var comments = (from m in Db.MyModelChild
                        where m.ParentId.ToString() == id
                        select m).ToList();

        Logger.LogInformation("Test log information 001");
        if (comments.Count == 0)
        {
            Logger.LogInformation("Test log information 002");
            return NotFound();
        }
        return new ObjectResult(comments);
    }

      On success this method emits the MyModelChild objects as JSON.

7) Now check the Logs folder in the project root; a file is created there named "Logs_" followed by the date in yyyyMMdd format, e.g.
    Logs_20180116.txt. The logged text also includes a timestamp.
   
    Output:
        INFO  16-01-2018 11:13:24 Controllers.TestController [7] - Test log information 001

        In my case data is returned, so the log line above appears along with the other INFO-level output from dotnet run.

     Other log levels besides Information can be written the same way.
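For illustration, the same ILogger writes any level; a short sketch (the id variable and the failing operation are placeholders, not part of the controller above):

```csharp
Logger.LogWarning("No comments found for id {Id}", id);

try
{
    // placeholder for an operation that may throw
    throw new InvalidOperationException("demo");
}
catch (Exception ex)
{
    // %exception in the conversion pattern prints the stack trace after the message
    Logger.LogError(ex, "Failed while loading comments");
}
```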

Saturday, January 6, 2018

Serializing & deserializing an object to & from JSON


The .NET Framework 3.5 has native support for serializing an object into a JSON string. To illustrate this we will create a console application. In it we first create a simple entity class called Device with properties DeviceName, BrandName, Price, Resolution & DeviceCode.
Next, in the console's Main we create a List-based collection of Device objects.
To serialize this collection we use functionality from the System.Web.Script.Serialization namespace, which requires adding a reference to the corresponding DLL.
We create two static functions, one for serializing (SerializeJsonToString) and one for deserializing (DeSerializeJsonStringToObj).
For serializing we create an object of the JavaScriptSerializer class, whose Serialize() method takes an object and converts it into the equivalent JSON string.
For deserialization we use the Deserialize<T>() method, to which we must supply the type being deserialized; in our case it is List<Device>, a List-based collection of Device objects.

using System;
using System.Collections.Generic;
using System.Web.Script.Serialization;

namespace JsonSerialization
{
    class Program
    {
        static void Main(string[] args)
        {
            Device Note3 = new Device { DeviceName = "Samsung Galaxy Note III", BrandName = "Samsung", Price = "43,787", Resolution = "1024X768", DeviceCode = "GTN623" };
            Device Note2 = new Device { DeviceName = "Samsung Galaxy Note II", BrandName = "Samsung", Price = "33,250", Resolution = "800X600", DeviceCode = "GTN523" };
            Device Mega = new Device { DeviceName = "Samsung Galaxy Mega", BrandName = "Samsung", Price = "23,787", Resolution = "600X425", DeviceCode = "GTN123" };
            Device Lumia = new Device { DeviceName = "Nokia Lumia 520", BrandName = "Nokia", Price = "20,787", Resolution = "600X425" };

            List<Device> DeviceList = new List<Device>();
            DeviceList.AddRange(new Device[] { Note3, Note2, Mega, Lumia });

            // Serialize the collection to a JSON string...
            string SerializedString = SerializeJsonToString(DeviceList);

            // ...and deserialize it back into a List<Device>.
            List<Device> DeserializedDeviceList = DeSerializeJsonStringToObj(SerializedString);

            Console.ReadKey();
        }

        public static string SerializeJsonToString(List<Device> obj2Serialize)
        {
            JavaScriptSerializer oSerializer = new JavaScriptSerializer();
            return oSerializer.Serialize(obj2Serialize);
        }

        public static List<Device> DeSerializeJsonStringToObj(string jsonString)
        {
            JavaScriptSerializer oSerializer = new JavaScriptSerializer();
            return oSerializer.Deserialize<List<Device>>(jsonString);
        }
    }

    // Entity class
    public class Device
    {
        public string DeviceName { get; set; }
        public string BrandName { get; set; }
        public string DeviceCode { get; set; }
        public string Price { get; set; }
        public string Resolution { get; set; }
    }
}

Postgres Cross-Server/Cross-Database Querying


To demonstrate cross-database querying we will create two databases, "PlaygroundLive" & "PlaygroundStaging". They can be on the same PostgreSQL server or on different servers; in the current case they are on the same server.

On both databases create the table below:

create table TestTable
(
  id int primary key,
  name varchar(500)
)

On staging, add three rows:

insert into TestTable(id,name) values(1,'sagar'),(2,'sangram'),(3,'sachin');

On live, add two rows:

insert into TestTable(id,name) values(1,'sagar'),(2,'sangram');

Now create the dblink extension and the foreign data wrapper:

CREATE EXTENSION postgres_fdw;

CREATE EXTENSION dblink;

Verify the extensions were created by listing the newly added procedures/functions related to dblink:

SELECT pg_namespace.nspname, pg_proc.proname FROM pg_proc, pg_namespace WHERE pg_proc.pronamespace=pg_namespace.oid AND pg_proc.proname LIKE '%dblink%';

From the staging database, connect to the live database and check whether connectivity succeeds; the query below returns OK on success:

SELECT dblink_connect('host=localhost user=postgres password=sangram dbname=PlaygroundLive');

CREATE FOREIGN DATA WRAPPER dbrnd VALIDATOR postgresql_fdw_validator;

Creating the server connection:
Both databases are on the same PostgreSQL server here, yet we still follow the same process: we provide the server IP and the database to connect to.
CREATE SERVER demodbrnd FOREIGN DATA WRAPPER postgres_fdw OPTIONS (hostaddr '127.0.0.1', dbname 'PlaygroundLive');

You can check whether server creation succeeded by running the command below:

SELECT * FROM pg_foreign_server; 
 
CREATE USER MAPPING FOR postgres SERVER demodbrnd OPTIONS (user 'postgres',
password 'sangram');
 
Here the postgres user on the local server is mapped to the postgres user on the remote server.

Connect to Server:

SELECT dblink_connect('demodbrnd');

GRANT USAGE ON FOREIGN SERVER demodbrnd TO postgres;

Running Cross Database Queries:
Now from the staging database we can run queries against the live database.


1) Select:
SELECT * FROM public.dblink('demodbrnd','select id,name from public.TestTable') AS DATA(id INTEGER,name CHARACTER VARYING);

The output will contain two records.

2) INSERT:
SELECT dblink_connect('demodbrnd');
select * from dblink_exec('INSERT INTO public.TestTable values(3,''sachin'')');

One more record is added to TestTable on the live database; TestTable is now identical on both databases.

3) SELECT INTO:
SELECT * INTO temp_TestTable
FROM public.dblink('demodbrnd','select id,name from public.TestTable') 
AS DATA(id INTEGER,name CHARACTER VARYING);

Here temp_TestTable, created on the staging database, receives all three records from TestTable on the live database.

4) Parametrized SELECT:
--using parametrized query with prepare
PREPARE migrate_data (integer) AS
INSERT INTO temp_TestTable
SELECT id, name
FROM dblink('demodbrnd', 'select id,name from public.TestTable')
AS t(id integer, name varchar)
WHERE id > $1;

EXECUTE migrate_data(2);

Executing the prepared statement adds one more record to temp_TestTable, so there are now 4 records.

--check one more record got inserted
select * from temp_TestTable;

5) Importing a foreign schema:
Create a schema called "ForeignSchema" in the staging database.

Now import all tables from the remote database's public schema into the staging database's "ForeignSchema" as follows.

IMPORT FOREIGN SCHEMA public
FROM SERVER demodbrnd INTO "ForeignSchema";

Now remove the first record from the staging database's table so that we are sure the two tables differ:

delete from public.TestTable where id=1;

You can check that "TestTable" from the live server has been imported into the local database's "ForeignSchema" schema by running:

select * from "ForeignSchema".TestTable;

We can run INSERT INTO statements as well:
insert into "ForeignSchema".TestTable
select id+3,name from public.TestTable;

Dropping the foreign data wrapper:
To remove it, run the drop queries below in sequence, since the objects are stacked one above the other.
drop USER MAPPING FOR postgres SERVER demodbrnd ;
drop SERVER demodbrnd;
drop FOREIGN DATA WRAPPER dbrnd;

How to find IP address


To find the IP address of our Fedora machine we can use the ifconfig command:

#ifconfig -a

Output:

docker0: flags=4099<UP,BROADCAST,MULTICAST>  mtu 1500
        inet 172.17.0.1  netmask 255.255.0.0  broadcast 0.0.0.0
        ether 02:42:cd:a5:81:90  txqueuelen 0  (Ethernet)
        RX packets 0  bytes 0 (0.0 B)
        RX errors 0  dropped 0  overruns 0  frame 0
        TX packets 0  bytes 0 (0.0 B)
        TX errors 0  dropped 0 overruns 0  carrier 0  collisions 0

enp2s0: flags=4163<UP,BROADCAST,RUNNING,MULTICAST>  mtu 1500
        inet 192.168.0.103  netmask 255.255.255.0  broadcast 192.168.0.255
        inet6 fe80::52e5:49ff:fe9a:81b1  prefixlen 64  scopeid 0x20<link>
        ether 50:e5:49:9a:81:b1  txqueuelen 1000  (Ethernet)
        RX packets 93288  bytes 117692336 (112.2 MiB)
        RX errors 0  dropped 0  overruns 0  frame 0
        TX packets 71001  bytes 6798224 (6.4 MiB)
        TX errors 0  dropped 0 overruns 0  carrier 0  collisions 0

enp3s6: flags=4099<UP,BROADCAST,MULTICAST>  mtu 1500
        ether 00:e0:4c:49:e3:31  txqueuelen 1000  (Ethernet)
        RX packets 0  bytes 0 (0.0 B)
        RX errors 0  dropped 0  overruns 0  frame 0
        TX packets 0  bytes 0 (0.0 B)
        TX errors 0  dropped 0 overruns 0  carrier 0  collisions 0

lo: flags=73<UP,LOOPBACK,RUNNING>  mtu 65536
        inet 127.0.0.1  netmask 255.0.0.0
        inet6 ::1  prefixlen 128  scopeid 0x10<host>
        loop  txqueuelen 1  (Local Loopback)
        RX packets 890  bytes 388569 (379.4 KiB)
        RX errors 0  dropped 0  overruns 0  frame 0
        TX packets 890  bytes 388569 (379.4 KiB)
        TX errors 0  dropped 0 overruns 0  carrier 0  collisions 0

virbr0: flags=4099<UP,BROADCAST,MULTICAST>  mtu 1500
        inet 192.168.124.1  netmask 255.255.255.0  broadcast 192.168.124.255
        ether 00:00:00:00:00:00  txqueuelen 1000  (Ethernet)
        RX packets 0  bytes 0 (0.0 B)
        RX errors 0  dropped 0  overruns 0  frame 0
        TX packets 0  bytes 0 (0.0 B)
        TX errors 0  dropped 0 overruns 0  carrier 0  collisions 0

virbr0-nic: flags=4098<BROADCAST,MULTICAST>  mtu 1500
        ether 52:54:00:a9:f6:1d  txqueuelen 1000  (Ethernet)
        RX packets 0  bytes 0 (0.0 B)
        RX errors 0  dropped 0  overruns 0  frame 0
        TX packets 0  bytes 0 (0.0 B)
        TX errors 0  dropped 0 overruns 0  carrier 0  collisions 0

What is enp2s0, when interface names are usually eth0, eth1? Let's google it. Here is what they say about "enp2s0":
    en    stands for ethernet
    p2    stands for bus number (2)
    s0    stands for slot number (0)


 Here enp2s0 & enp3s6 are wired interfaces. Our machine is connected to the internet over a wired connection, so within this output we look for the "inet" address of that interface: "192.168.0.103".
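On newer Fedora releases ifconfig may not be installed by default; the ip command from the iproute2 package (assumed present) reports the same addresses:

```shell
# List IPv4 addresses on all interfaces; look for the "inet" line
# of your wired interface (enp2s0 in the output above).
ip -4 addr show

# Narrow it to a single interface; loopback always shows 127.0.0.1.
ip -4 addr show lo
```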

Notes on Integrating Postgres Entity Framework with DOT NET Core 2.0

First create an ASP.NET Core 2.0 MVC app, say dotnetmvcapp, using

dotnet new mvc -o dotnetmvcapp

Now we will follow this online tutorial:

https://www.youtube.com/watch?v=md20lQut9EE

I am just listing my files, their final content, and a few important points.

dotnetmvcapp.csproj


<Project Sdk="Microsoft.NET.Sdk.Web">

  <PropertyGroup>
    <TargetFramework>netcoreapp2.0</TargetFramework>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="Microsoft.AspNetCore.All" Version="2.0.0" />
    <PackageReference Include="Npgsql.EntityFrameworkCore.PostgreSql" Version="2.0.0" />
    <PackageReference Include="Microsoft.EntityFrameworkCore.Tools" Version="2.0.0" />
    <PackageReference Include="Microsoft.EntityFrameworkCore.Design" Version="2.0.0" />
  </ItemGroup>

  <ItemGroup>
    <DotNetCliToolReference Include="Microsoft.VisualStudio.Web.CodeGeneration.Tools" Version="2.0.0" />
    <DotNetCliToolReference Include="Microsoft.EntityFrameworkCore.Tools.DotNet" Version="2.0.0" />
  </ItemGroup>

</Project>


run restore packages

dotnet restore


Startup.cs

In ConfigureServices, register MVC and the Npgsql EF provider:

public void ConfigureServices(IServiceCollection services)
{
    services.AddMvc();
    services.AddEntityFrameworkNpgsql()
        .AddDbContext<dotnetmvcappContext>(opt =>
            opt.UseNpgsql(Configuration.GetConnectionString("MyWebAppConnection")));
}
Here you need to add the using directives below to get rid of the errors at "UseNpgsql" &
"dotnetmvcappContext".

using dotnetmvcapp.Models;
using Microsoft.EntityFrameworkCore;




Create Model:

a) Create Models folder

b) Inside Models folder add

dotnetmvcappContext.cs

Users.cs


dotnetmvcappContext.cs

using Microsoft.EntityFrameworkCore;

namespace dotnetmvcapp.Models
{
    public class dotnetmvcappContext : DbContext
    {
        public dotnetmvcappContext(DbContextOptions<dotnetmvcappContext> options) : base(options)
        {
        }

        public DbSet<User> Users { get; set; }
    }
}


Users.cs

namespace dotnetmvcapp.Models
{
    public class User
    {
        public int Id { get; set; }
        public string Name { get; set; }
        public string Email { get; set; }
    }
}

appsettings.json

{
  "ConnectionStrings": {
    "MyWebAppConnection": "User ID=xdba;Password=sangram;Server=localhost;Port=5432;Database=xplay;Integrated Security=true;Pooling=true"
  },
  "Logging": {
    "IncludeScopes": false,
    "LogLevel": {
      "Default": "Warning"
    }
  }
}

Run Migration:
dotnet ef migrations add InitialMigration

It scaffolds the initial migration for our 'dotnetmvcappContext'.

The migration can be undone using

dotnet ef migrations remove

Apply Migration to Db:

dotnet ef database update

This will create a Users table inside Postgres which conforms to:


CREATE TABLE public."Users"
(
    "Id" integer NOT NULL DEFAULT nextval('"Users_Id_seq"'::regclass),
    "Email" text COLLATE pg_catalog."default",
    "Name" text COLLATE pg_catalog."default",
    CONSTRAINT "PK_Users" PRIMARY KEY ("Id")
)

You can check PostgreSQL and confirm the Users table has been created.
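As a quick sanity check of the mapping, the context can be queried from a controller. This is only a sketch; UsersController is a hypothetical file, not one of the tutorial files above:

```csharp
using System.Linq;
using Microsoft.AspNetCore.Mvc;
using dotnetmvcapp.Models;

namespace dotnetmvcapp.Controllers
{
    // Hypothetical controller: the context arrives via constructor injection,
    // as registered in ConfigureServices.
    public class UsersController : Controller
    {
        private readonly dotnetmvcappContext _context;

        public UsersController(dotnetmvcappContext context)
        {
            _context = context;
        }

        // GET /Users — returns every row of the Users table as JSON.
        public IActionResult Index()
        {
            return Json(_context.Users.ToList());
        }
    }
}
```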


Ubuntu -Resolving gitlab Permission denied (publickey) on push

    1) Once you have created a GitLab account, create a repository, say "abc", on GitLab, with "code_dir" as the local folder to push:
    cd code_dir
    git init
    git remote add origin git@gitlab.com:your.name/abc.git
    git add .
    git commit -m "Initial commit"
    git push -u origin master

      At the time of the final push I was greeted with a "Permission denied (publickey)" error. To resolve it we need to create a public/private key pair that identifies our machine and add it to GitLab
      under Settings, in the SSH Keys section.

    2) on terminal run
     ssh-keygen
     output:
        Generating public/private rsa key pair.
        Enter file in which to save the key (/home/sangram/.ssh/id_rsa):
        Enter passphrase (empty for no passphrase):
        Enter same passphrase again:
        Your identification has been saved in /home/username/.ssh/id_rsa.
        Your public key has been saved in /home/username/.ssh/id_rsa.pub.
        The key fingerprint is:
        SHA256:xwoAYiFpCTSM9QjvNfPfkh/NQAKhktRQ2zh6+tYo1u4 sangram@sangram-HP-Laptop-15-bs0xx
        The key's randomart image is:
        +---[RSA 2048]----+
        |XBB+. o.         |
        |=O.=.= .         |
        |. = @ . . .      |
        | . + *   +       |
        |  o . o S +      |
        |   o   o = +     |
        |  .. o  = o o    |
        |  o.+ .  o .     |
        | . =E     .      |
        +----[SHA256]-----+

    3) cat /home/username/.ssh/id_rsa.pub
           and copy all the output to the clipboard.

    4) Log in to GitLab and visit
        Settings --> SSH Keys
    In the textbox for the key, paste the key text and hit save.
    5) Now attempt the push again:
    git push -u origin master
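For scripting, the key-generation step above can also be run non-interactively; a sketch that creates a throwaway key pair with an empty passphrase in a temporary directory (do not overwrite your real ~/.ssh/id_rsa this way):

```shell
# Generate an RSA key pair with no passphrase into a temp directory.
keydir=$(mktemp -d)
ssh-keygen -t rsa -b 2048 -N "" -q -f "$keydir/id_demo"

# The .pub file is what gets pasted into GitLab's SSH Keys page.
cat "$keydir/id_demo.pub"
```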