
Monday, October 28, 2024

Kubernetes cheat sheet

Below are the major command-line commands; keep this handy to play in your playground easily.

kubectl get pods                                  ( listing of pods )
kubectl get services                              ( listing of services )
kubectl get deployments                           ( listing of deployments )
kubectl get nodes                                 ( listing of nodes )
kubectl get pods -o wide -A                       ( all pods in all namespaces )
kubectl describe pod <podname>
kubectl describe service <servicename>
kubectl apply -f <yamlfilename>
kubectl delete pod <podname>
kubectl delete -f <yamlfilename>
kubectl logs <podname>
kubectl logs <podname> -c <containername>
kubectl exec -it <podname> -- <shellname>         ( to log in to a container )
kubectl scale deployment <depname> --replicas=<n>
kubectl expose deployment <depname>
kubectl port-forward <podname> <port1>:<port2>
kubectl rollout status deployment/<depname>
kubectl config get-contexts
kubectl config use-context <contextname>
kubectl config view




Monday, October 21, 2024

Kubernetes Networking, a Nightmare

The heartbeat of Kubernetes is its network, through which pods, nodes, and services work together. You will see a lot of hiccups configuring the infrastructure and network from scratch, so I suggest going for the ready-made networking models and service meshes available in the market, typically the enterprise versions.
If you are familiar with the basic networking IPs in Kubernetes, it has:
- Node IP, for pod-to-pod communication between nodes
- Pod IP, a single IP for each pod
- Service IP, a virtual IP representing a group of pods

I am briefing here some of the best practices you can consider while setting up your IPs.
1. Use a service mesh like Linkerd.
2. Use DNS to resolve the IPs of your nodes and pods.
3. Better to allocate node, pod, and service IPs in different VPCs/networks.

You can connect with me if any assistance is required in setting up a Kubernetes cluster, bringing up services and deployments, or configuring service meshes.

Thank you, happy reading.

Saturday, September 7, 2019

Docker - Spin Up a Container, Then a Site

This post is very useful for those who are familiar with Docker. It will also help those who are at the beginner stage of learning Docker.

As you all know, Docker is the best choice when we want to test or manage applications across various versions or environments.

Here I am giving an example of spinning up a Java container to bring a Java HelloWorld application up.

In my next post, I will give you the internals of Docker Networking.

The prerequisites for this are to have Docker and Java installed in your environment.

- Create Hello World java code

public  class HelloJeevanWorld
{
  public static void main(String[] args)
{
    System.out.println("Hello Jeevan World");
  }
}
 
- Build it with javac and package it into HelloJeevanWorld.jar
- Create a manifest file (with Main-Class: HelloJeevanWorld) when building the jar, since java -jar needs it
- Create a Dockerfile with the below content

---------------------------------------------------
FROM java:8
RUN mkdir -p /tmp/Jeevan/
WORKDIR /tmp/Jeevan
COPY HelloJeevanWorld.jar /tmp/Jeevan/
EXPOSE 8080
CMD java -jar HelloJeevanWorld.jar
 
------------------------------------------------- 
 
 
Here the FROM instruction pulls the mentioned image from Docker Hub/registry.
RUN executes a Unix command at build time, here to create a directory.
WORKDIR denotes the working folder inside the image.
COPY copies the Java artifact from the Docker host into the image so that it is
available to the container.
EXPOSE documents that the container listens on port 8080. You can publish it while
running docker run with -p 8080:8080 (in the format external port:container port).
CMD in the last line runs the jar when the container starts.


- Build the Docker image using this Dockerfile with the command below, which creates
 an image you can see by executing docker images (image names must be lowercase)
docker build -t jeevanhelloworld .
- Spin up a container using
docker run -p 8080:8080 jeevanhelloworld

- You can access this using http://dockerhostip:8080/ from a browser


   In the next posts I will discuss Docker port forwarding, using an nginx reverse
proxy for Docker-hosted applications, creating Docker networks, and spinning up containers within a specific IP address range.
 
 
Thanks for visiting.
Do Share My Blog -> http://TheUnixBlogOne.blogspot.in 
 
 

Monday, July 18, 2016

export .. Always a Confusion for Developers

I bet more than fifty percent of Unix users are confused about using export, even though they encounter it in many scripts.

This small example clears up all your export confusion.

To give a definition of export, as usual:

It is a Unix built-in command to mark a variable for export to child process(es).


Example (No export):

>>myvariable=10
>>echo $myvariable
10
>>cat export_test_script

echo $myvariable

>>
>>./export_test_script
>>


Example (With export):

>>export myvariable=10  (use export myvariable if myvariable  is already defined)
>>echo $myvariable
10
>>cat export_test_script

echo $myvariable

>>./export_test_script
10
>>
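The two sessions above can be reproduced in one self-contained sketch (the /tmp path and the variable name are just illustrative):

```shell
#!/bin/sh
# A throwaway child script that prints the variable it inherited (if any).
cat > /tmp/export_demo_child.sh <<'EOF'
#!/bin/sh
echo "child sees: [$myvariable]"
EOF
chmod +x /tmp/export_demo_child.sh

myvariable=10                          # plain assignment, not exported
without=$(/tmp/export_demo_child.sh)   # child does not inherit it
echo "$without"

export myvariable                      # mark it for export to children
with=$(/tmp/export_demo_child.sh)      # now the child inherits it
echo "$with"

rm -f /tmp/export_demo_child.sh
```

The first call prints an empty value; only after export does the child see 10.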




Friday, July 10, 2015

To avoid overriding of /etc/resolv.conf (Ubuntu 14.04) ..

While setting up DNS for the name server, I came across this situation where the /etc/resolv.conf file was not getting updated.

To avoid this ..
- add the nameserver details in the base file (/etc/resolvconf/resolv.conf.d/base)
- then regenerate resolv.conf (sudo resolvconf -u)
- restart the network manager ( -> service network-manager stop and .. start)



Saturday, June 6, 2015

Socket Configuration With The Client System ..

********************************************

06-June-2015


Requirements before going into this article:
i) Basic idea of a socket and its use
ii) Usage of Perl modules

********************************************

Generally Socket configuration consists of..

Creating Socket
Bind it to port
Client Connection Establishment

## The related library is IO::Socket::INET

Below is the skeleton code,

where XXX needs values with respect to the local configuration.


$socket = new IO::Socket::INET (
LocalHost => 'XXX',
LocalPort => 'XXXX',
Proto => 'tcp',
Listen => 5, # listen backlog; varies based on configuration
Reuse => 1
) or die "error in creation of socket: $!";


## repeat an infinite loop, accepting client connections, something like

while(1)
{

$client_sock = $socket->accept();

# to fetch the peer host
$peer_address = $client_sock->peerhost();
# to fetch the peer port number
$peer_port = $client_sock->peerport();

# this is the end of the connection setup

$data = "data from Server";
print $client_sock "$data\n";

$data = <$client_sock>;

print "$data is received from client\n";
}
# finally, end the connection by
$socket->close();



Friday, January 9, 2015

Putty Auto Login Through Batch



A Unix terminal can easily be accessed via a batch file instead of opening PuTTY the traditional, repetitive way.


Use below in a batch file

start <putty.exe path> <unixhostname> -l username -pw password

Example:

start C:\putty.exe jmksrv.uk.mm -l uname1 -pw pass@12345


Monday, July 14, 2014

Sql Loader Data Load in Unix



When we deal with a Unix/Oracle environment, importing a file into a DB table with SQL*Loader is always effective.

Basic flow is ...


read file -> sqlldr takes reference of the control file -> check table description -> validate -> load line by line

Below is the syntax to work with...

sqlldr username/password control=ctrl_file_name log=logfile

In case default credentials are set through a cfg file, username and password can be omitted.

Sample control file:

OPTIONS (DIRECT=TRUE)
LOAD DATA
INFILE sample.log
--
INSERT
INTO TABLE sample
--
(sample_col1
,sample_col2
,sample_col3
)


Thursday, April 24, 2014

Random Numbers

Random numbers can be produced in Unix shells, as in most other programming languages, with the shell variable RANDOM.

$echo $RANDOM
29003
$echo $RANDOM
4613

PS: The required range can then be produced by taking the modulo (%).

$echo $((RANDOM % 120))
4
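A minimal sketch of narrowing $RANDOM to a range (note that RANDOM is a bash/ksh feature, not plain POSIX sh):

```shell
#!/bin/bash
# $RANDOM yields an integer from 0 to 32767; modulo narrows the range.
n=$((RANDOM % 120))     # always 0..119
echo "$n"
```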

Thursday, March 27, 2014

Debugging Child and Source Unix Scripts

Debugging The Original:


A script gets executed in debug mode if we specify sh -x while calling it.


 1)By sh -x


Say script.sh is file with content

$cat script.sh

count=1
while [ $count -lt 4 ]
do
echo "iteration: $count"
count=`expr $count + 1`
done

$./script.sh

iteration: 1
iteration: 2
iteration: 3

$sh -x ./script.sh

+ count=1
+ [ 1 -lt 4 ]
+ echo iteration: 1
iteration: 1
+ expr 1 + 1
+ count=2
+ [ 2 -lt 4 ]
+ echo iteration: 2
iteration: 2
+ expr 2 + 1
+ count=3
+ [ 3 -lt 4 ]
+ echo iteration: 3
iteration: 3
+ expr 3 + 1
+ count=4
+ [ 4 -lt 4 ]


For Child Scripts ::


The actual problem is here ...

As I mentioned, debugging can be done by specifying sh -x while executing the script, but the same will not work in case any child script gets called by the parent.

In this case the same action can be performed by specifying set -x after the shebang line in the child script.
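A small self-contained sketch of the child-script case (paths under /tmp are just illustrative): the parent is run normally, yet the child's own set -x produces the trace.

```shell
#!/bin/sh
# Child script: enables its own tracing with set -x after the shebang.
cat > /tmp/dbg_child.sh <<'EOF'
#!/bin/sh
set -x            # tracing turned on inside the child itself
count=1
echo "iteration: $count"
EOF
chmod +x /tmp/dbg_child.sh

# Parent script: calls the child, with no tracing of its own.
cat > /tmp/dbg_parent.sh <<'EOF'
#!/bin/sh
/tmp/dbg_child.sh
EOF
chmod +x /tmp/dbg_parent.sh

# The parent runs normally, yet the child's set -x emits a trace on stderr.
trace=$(/tmp/dbg_parent.sh 2>&1)
echo "$trace"

rm -f /tmp/dbg_child.sh /tmp/dbg_parent.sh
```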





Compiling Java Programs (csh)



The basic requirement to compile and execute Java programs on Unix is to have a JDK installed on the box.

If available, you should be able to see the setup file named setup.csh at /usr/usc/jdk/<version num>/

You need to run the setup file. For this, add this to .login:

if ( -f /usr/usc/jdk/1.4.2/setup.csh ) then
    source /usr/usc/jdk/1.4.2/setup.csh
endif

Here source is used to execute the file in the current shell.
After this you can compile your Java program, something like

%javac program.java

It creates a .class file with the same name as your Java program, i.e. program.class
In order to execute the already compiled program, the following command can be used.

%java program

Script to Disable Ctrl+C on a Unix Box



Some examples using trap are already discussed in a separate post, where numeric notation is used for the different signals.

Below are the standard examples by which we can come to know the importance of trap in our day to day scenarios.

To disable the activity of Ctrl+C ...

trap "echo 'cntrl c is disabled'" INT

This can be placed in .profile or .login accordingly.

Script:

I tried this simple script to ignore control signals like Ctrl+C (SIGINT)

TrapCase()
{
echo "entered control char"
OriginalFunction
}

OriginalFunction()
{
echo "enter input"
trap 'TrapCase' 1 2 15
read value
echo "This is after trap is executed"
echo $value
}

#Calling basic function
OriginalFunction
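A variant of the same idea that can be exercised without a terminal: the sketch below installs a handler and then delivers SIGTERM (15) to itself, so the handler's effect can be observed directly (function and variable names are illustrative).

```shell
#!/bin/sh
# Install a handler, then deliver SIGTERM (15) to this very shell;
# the handler runs and the script keeps going instead of dying.
caught=no
TrapCase()
{
    caught=yes
    echo "signal intercepted"
}

trap 'TrapCase' 15    # install the handler for SIGTERM
kill -15 $$           # send SIGTERM to ourselves
echo "still running, caught=$caught"
```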


Friday, March 14, 2014

OFS (Output Field Separator) in AWK

Likewise FS (the input field separator), we can define OFS (Output Field Separator) in awk to design the output...
Say abc.txt is a file contains below data..

$cat abc.txt
ccc#bbb#111
Nnn#hgh#ghs#hhsh

$awk -F'#' '{print $1 $2}' abc.txt
cccbbb
Nnnhgh

Below is the example with OFS where output is formatted

$ awk -F'#' 'BEGIN{OFS="-"}{print $1,$2}' abc.txt
ccc-bbb
Nnn-hgh

Specifying a , between the columns is mandatory; OFS will be ignored in case the , is not specified.
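One more wrinkle worth knowing: OFS also takes effect when awk rebuilds the whole record, e.g. after assigning to a field. A small sketch (the /tmp file path is illustrative):

```shell
#!/bin/sh
# OFS is applied when awk rebuilds the record; assigning $1=$1 forces
# a rebuild, so the whole line comes out joined with the new separator.
printf 'ccc#bbb#111\nNnn#hgh#ghs\n' > /tmp/ofs_demo.txt

joined=$(awk -F'#' 'BEGIN{OFS="-"}{$1=$1; print}' /tmp/ofs_demo.txt)
echo "$joined"

rm -f /tmp/ofs_demo.txt
```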

Wednesday, March 12, 2014

Unix Font Coloring

Font colour at the terminal level can be changed with escape options of print. The same can be placed in .profile as well for convenience.

print "\033[33m This is YELLOW \n"
print "\033[36m This is BLUE \n"
print "\033[35m This is PINK \n"

To get back to the normal text ...

print "\033[m This is Normal Text\n"

Examples:


$print "\033[33m This is YELLOW \n"
This is YELLOW
$
$print "\033[36m This is BLUE \n"
This is BLUE
$
$print "\033[35m This is PINK \n"
This is PINK 
$
$
$print "\033[m This is Normal Text\n"
This is Normal Text
$
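print is a ksh built-in; on shells without it, printf does the same job with the same escape sequences. A small portable sketch (\033[0m resets to the normal text):

```shell
#!/bin/sh
# printf understands the same \033[...m escapes; [0m resets the colour.
line=$(printf '\033[33mThis is YELLOW\033[0m')
echo "$line"
```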

Friday, February 7, 2014

Trap with Examples :--

Trap:


Signals are software interrupts sent to a program to indicate that an important event has occurred. The events can vary from user requests to illegal memory access errors. Some signals, such as the interrupt signal, indicate that a user has asked the program to do something that is not in the usual flow of control.
The following are some of the more common signals you might encounter and want to use in your programs:
Signal Name   Signal Number   Description
SIGHUP        1               Hangup detected on controlling terminal or death of controlling process
SIGINT        2               Issued if the user sends an interrupt signal (Ctrl+C)
SIGQUIT       3               Issued if the user sends a quit signal (Ctrl+\)
SIGFPE        8               Issued if an illegal mathematical operation is attempted
SIGKILL       9               If a process gets this signal it must quit immediately and will not perform any clean-up operations
SIGALRM       14              Alarm clock signal (used for timers)
SIGTERM       15              Software termination signal (sent by kill by default)

List of Signals:

There is an easy way to list all the signals supported by your system: just issue the kill -l command and it will display all the supported signals.
The actual list of signals varies between Solaris, HP-UX, and Linux.

Sending Signals:

There are several methods of delivering signals to a program or script. One of the most common is for a user to type CONTROL-C or the INTERRUPT key while a script is executing.
When you press the Ctrl+C key a SIGINT is sent to the script and as per defined default action script terminates.
The other common method for delivering signals is to use the kill command whose syntax is as follows:
$ kill -signal pid
Here signal is either the number or name of the signal to deliver and pid is the process ID that the signal should be sent to. For Example:
$ kill -1 1001
Sends the HUP or hang-up signal to the program that is running with process ID 1001. To send a kill signal to the same process, use the following command:
$ kill -9 1001
This would kill the process running with process ID 1001.

Cleaning Up Temporary Files:


As an example of the trap command, the following shows how you can remove some files and then exit if someone tries to abort the program from the terminal:
$ trap "rm -f $WORKDIR/work1$$ $WORKDIR/dataout$$; exit" 2
From the point in the shell program that this trap is executed, the two files work1$$ and dataout$$ will be automatically removed if signal number 2 is received by the program.
So if the user interrupts execution of the program after this trap is executed, you can be assured that these two files will be cleaned up. The exit command that follows the rm is necessary because without it execution would continue in the program at the point that it left off when the signal was received.
Signal number 1 is generated for hangup: Either someone intentionally hangs up the line or the line gets accidentally disconnected.
You can modify the preceding trap to also remove the two specified files in this case by adding signal number 1 to the list of signals:
$ trap "rm $WORKDIR/work1$$ $WORKDIR/dataout$$; exit" 1 2
Now these files will be removed if the line gets hung up or if the Ctrl+C key gets pressed.
The commands specified to trap must be enclosed in quotes if they contain more than one command. Also note that the shell scans the command line at the time that the trap command gets executed and also again when one of the listed signals is received.
So in the preceding example, the value of WORKDIR and $$ will be substituted at the time that the trap command is executed. If you wanted this substitution to occur at the time that either signal 1 or 2 was received you can put the commands inside single quotes:
$ trap 'rm $WORKDIR/work1$$ $WORKDIR/dataout$$; exit' 1 2

Ignoring Signals:

If the command listed for trap is null, the specified signal will be ignored when received. For example, the command:
$ trap '' 2
Specifies that the interrupt signal is to be ignored. You might want to ignore certain signals when performing some operation that you don't want interrupted. You can specify multiple signals to be ignored as follows:
$ trap '' 1 2 3 15
Note that the first argument must be specified for a signal to be ignored and is not equivalent to writing the following, which has a separate meaning of its own:
$ trap  2
If you ignore a signal, all subshells also ignore that signal. However, if you specify an action to be taken on receipt of a signal, all subshells will still take the default action on receipt of that signal.
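A quick way to see the null trap in action without a terminal: the subshell in this sketch ignores SIGTERM (15), so it survives a kill that would otherwise terminate it.

```shell
#!/bin/sh
# trap '' 15 makes the subshell ignore SIGTERM, so the kill is a no-op
# and the final echo still runs.
result=$(sh -c 'trap "" 15; kill -15 $$; echo survived')
echo "$result"
```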

Resetting Traps:

After you've changed the default action to be taken on receipt of a signal, you can change it back again with trap if you simply omit the first argument; so
$ trap 1 2
resets the action to be taken on receipt of signals 1 or 2 back to the default.

Tuesday, February 21, 2012

File Viewing

Below are the commands used to view the contents of a file, a soft link, or a hard link without opening it in an editor.

(Explanations of these links are given in a separate post.)


i) cat

cat filename

ii) more

more filename

iii) less

 less filename

(Note: press Enter/Space to continue and q to terminate the execution)


Difference B/W more and less:


more just prints the text as-is, stopping at page breaks, and does not clear the screen; it can be backgrounded.


less is a full-screen application that gives you a searchable, scrollable window and clears the screen after you exit.

Difference B/W more and cat:

Both can read the standard output of another command through a pipe.

For example, both of these work:

ls -lrt | more

ls -lrt | cat

The difference is that more paginates the output, while cat simply streams it through unchanged.
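A quick way to check how cat behaves in a pipeline is to capture its output directly:

```shell
#!/bin/sh
# cat in a pipeline just passes standard input through unchanged.
piped=$(printf '1\n2\n3\n' | cat)
echo "$piped"
```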

Friday, May 6, 2011

Cut with Examples

Cut is a command to extract portions of text from a file or from the given text.

Below are some of the examples based on a file test.txt whose contents are as below.

$ cat test.txt
cat command for file oriented operations.
cp command for copy files or directories.
ls command to list out files and directories with its attributes.
--------------------------------------------

1. Select Column of Characters

To extract only a desired column of characters (in this case column 2) from a file, use the -c option.

$ cut -c2 test.txt
a
p
s

--------------------------------------------

2. Select Column of Characters using Range

The following example extracts first 3 characters of each line.

$ cut -c1-3 test.txt
cat
cp
ls

----------------------------------------

3. Select Column of Characters using either Start or End Position

The following specifies only the start position before the ‘-’. This example extracts from 3rd character to end of each line from test.txt file.

$ cut -c3- test.txt
t command for file oriented operations.
 command for copy files or directories.
 command to list out files and directories with its attributes.
The following specifies only the end position after the ‘-’. This example extracts 8 characters from the beginning of each line from test.txt file.

$ cut -c-8 test.txt
cat comm
cp comma
ls comma
The entire line would get printed when you don’t specify a number before or after the ‘-’ as shown below.

$ cut -c- test.txt
cat command for file oriented operations.
cp command for copy files or directories.
ls command to list out files and directories with its attributes.
------------------------------------------
4. Select a Specific Field from a File

Instead of selecting x number of characters, if you like to extract a whole field, you can combine option -f and -d.

The following example displays only first field of each lines from /etc/passwd file using the field delimiter : (colon).

$ cut -d':' -f1 /etc/passwd
root
daemon
bin
sys
sync
games
bala
----------------------------------------
5. Select Multiple Fields from a File

$ grep "/bin/bash" /etc/passwd | cut -d':' -f1,6
root:/root
bala:/home/bala
To display a range of fields, specify the start field and end field as shown below. In this example, we are selecting fields 1 through 4, 6 and 7.

$ grep "/bin/bash" /etc/passwd | cut -d':' -f1-4,6,7
root:x:0:0:/root:/bin/bash
bala:x:1000:1000:/home/bala:/bin/bash

------------------------------------------

6. Select Fields Only When a Line Contains the Delimiter

In our /etc/passwd example, if you pass a different delimiter other than : (colon), cut will just display the whole line.

In the following example, we’ve specified the delimiter as | (pipe), and cut command simply displays the whole line, even when it doesn’t find any line that has | (pipe) as delimiter.

$ grep "/bin/bash" /etc/passwd | cut -d'|' -f1
root:x:0:0:root:/root:/bin/bash
bala:x:1000:1000:bala,,,:/home/bala:/bin/bash
But, it is possible to filter and display only the lines that contains the specified delimiter using -s option.

The following example doesn’t display any output, as the cut command didn’t find any lines that has | (pipe) as delimiter in the /etc/passwd file.

$ grep "/bin/bash" /etc/passwd | cut -d'|' -s -f1
------------------------------------------

7. Complement :

Select All Fields Except the Specified Fields

using the --complement option.

The following example displays all the fields from /etc/passwd file except field 7

$ grep "/bin/bash" /etc/passwd | cut -d':' --complement -s -f7
root:x:0:0:root:/root
bala:x:1000:1000:bala,,,:/home/bala

--------------------------------------

8. Output Delimiter:

To change the output delimiter use the option --output-delimiter as shown below. In this example, the input delimiter is : (colon), but the output delimiter is # (hash).

$ grep "/bin/bash" /etc/passwd | cut -d':' -s -f1,6,7 --output-delimiter='#'
root#/root#/bin/bash
bala#/home/bala#/bin/bash
-----------------------------------------

9. Change Output Delimiter to Newline

In this example, each and every field of the cut command output is displayed on a separate line. We still used --output-delimiter, but the value is $'\n', which indicates that a newline should be used as the output delimiter.

$ grep bala /etc/passwd | cut -d':' -f1,6,7 --output-delimiter=$'\n'
bala
/home/bala
/bin/bash
----------------------------------------

10. Example of combining Cut with Other Unix Commands

$ ps axu | grep python | sed 's/\s\+/ /g' | cut -d' ' -f2,11-
2231 /usr/bin/python /usr/lib/unity-lens-video/unity-lens-video
2311 /usr/bin/python /usr/lib/unity-scope-video-remote/unity-scope-video-remote
2414 /usr/bin/python /usr/lib/ubuntuone-client/ubuntuone-syncdaemon
2463 /usr/bin/python /usr/lib/system-service/system-service-d
3274 grep --color=auto python

Saturday, February 6, 2010

AWK with Examples

BASICS ....

Let's take an example of a file abc.txt whose contents are


$cat abc.txt

10 100 1000 one hundred
20 200 2000 two twohundred
30 300 3000 three threehundred


To print the first column


$awk '{print $1}' abc.txt
10
20
30

To print the first and second column

$awk '{print $1, $2}' abc.txt
10 100
20 200
30 300


By default, whitespace is the delimiter. You can change it with the -F flag. Likewise, the output separator can be set with OFS, like


$awk 'BEGIN{OFS=":"}{print $1,$4}' abc.txt

10:one
20:two
30:three

BUILT-IN VARIABLES:

NR                         Current record (line) number
FS                          Input field separator
OFS                       Output field separator

To count the number of rows of a file (to replace wc -l abc.txt)

$awk 'BEGIN {count=0} {count=count+1} END {print count}' abc.txt
3

(OR)

simply

$ awk 'END {print NR}' abc.txt
3

To compute the sum of the nth column of a file (say 2 in this case)

$awk 'BEGIN {sum=0} {sum=sum+$2} END {print sum}' abc.txt
600

Let's go to some advanced ...

Reading of input and printing can be managed by field separators, which is explained in a separate post.

Loops, control statements, file handling and variable passing can also be done in awk, as explained below.

<UNDER CONSTRUCTION> :)
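As a small taste of what is coming, here is a sketch of a loop and a control statement inside a single awk program:

```shell
#!/bin/sh
# A for loop plus an if statement inside one awk BEGIN block.
out=$(awk 'BEGIN {
    for (i = 1; i <= 4; i++)
        if (i % 2 == 0)
            print i, "is even"
}')
echo "$out"
```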

Sunday, February 8, 2009

Redirections, Standard I/O


Basics ...


In most cases we use these redirection operators in shell scripts or at the prompt level.

./scriptname.sh > logfile         Sends the standard output of ./scriptname.sh to logfile, replacing the existing logfile

./scriptname.sh >> logfile     Sends the standard output of ./scriptname.sh to logfile, appending to the existing logfile

./scriptname.sh 2> logfile       Sends the standard error produced while running scriptname.sh to logfile

./scriptname.sh 2>> logfile    Sends the standard error produced while running scriptname.sh to logfile, with append

Lets go to some advanced...

To send standard error as well as standard output to logfile

./scriptname.sh > logfile 2>&1

(OR, in bash)

./scriptname.sh &> logfile

Note that the order matters: 2>&1 must come after > logfile, otherwise standard error is still pointing at the terminal.
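A self-contained sketch demonstrating the combined redirection (script and file names under /tmp are illustrative):

```shell
#!/bin/sh
# A script that writes to both streams, then the combined redirection.
cat > /tmp/redir_demo.sh <<'EOF'
#!/bin/sh
echo "to stdout"
echo "to stderr" 1>&2
EOF
chmod +x /tmp/redir_demo.sh

# stdout goes to the file first, then stderr is pointed at the same place.
/tmp/redir_demo.sh > /tmp/redir.log 2>&1
logged=$(cat /tmp/redir.log)
echo "$logged"

rm -f /tmp/redir_demo.sh /tmp/redir.log
```

Both lines end up in the logfile; with the operators reversed, the stderr line would stay on the terminal.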