Thursday, September 29, 2016

To find locks in Oracle Database

select * from dba_locks;

-- only sessions whose locks are blocking other sessions
select * from dba_locks where blocking_others != 'Not Blocking';

Wednesday, September 28, 2016

Price list setup Query

SELECT h.DESCRIPTION,
  h.CURRENCY_CODE,
  h.ACTIVE_FLAG,
  h.LAST_UPDATE_DATE,
  h.ROUNDING_FACTOR,
  h.FREIGHT_TERMS_CODE,
  h.PTE_CODE,
  h.COMMENTS,
  L.LIST_PRICE_UOM_CODE,
  L.PRIMARY_UOM_FLAG,
  L.LIST_PRICE,
  L.LIST_LINE_TYPE_CODE,
  L.OPERAND,
  L.START_DATE_ACTIVE,
  L.PRODUCT_PRECEDENCE,
  L.ARITHMETIC_OPERATOR APPLICATION_METHOD,
  L.ATTRIBUTE1 INITIAL_PRICE,
  L.ATTRIBUTE2 INCREMENTS,
  L.ATTRIBUTE3 RETAIL_PRICE,
  L.ATTRIBUTE4 CLASS,
  L.ATTRIBUTE5 DIVISION,
  L.ATTRIBUTE6 DEPARTMENT,
  L.ATTRIBUTE7 VENDOR_SLOT,
  L.ATTRIBUTE8 COUNTRY_OF_ORIGIN,
  L.ATTRIBUTE9 LABEL_NAME1,
  L.ATTRIBUTE10 LABEL_NAME2,
  L.ATTRIBUTE14 PR_NUMBER1,
  L.ATTRIBUTE15 PR_NUMBER2,
  F.NAME,
  I.DESCRIPTION PRODUCT_DESCRIPTION,
  I.SEGMENT1
  || '.'
  || I.SEGMENT2 ITEM_NUMBER,
  A.PRODUCT_ATTRIBUTE_CONTEXT PRODUCT_CONTEXT,
  A.PRODUCT_ATTRIBUTE,
  A.PRODUCT_ATTR_VALUE PRODUCT_VALUE,
  A.PRICING_ATTRIBUTE_CONTEXT PRICING_CONTEXT,
  A.PRICING_ATTRIBUTE,
  A.COMPARISON_OPERATOR_CODE OPERATOR,
  A.PRICING_ATTR_VALUE_FROM VALUE_FROM,
  A.PRICING_ATTR_VALUE_TO VALUE_TO
FROM qp_list_headers h,
  QP_LIST_LINES L,
  QP_PRICE_FORMULAS_TL F,
  mtl_system_items_b I,
  QP_PRICING_ATTRIBUTES A
WHERE H.LIST_HEADER_ID               = L.LIST_HEADER_ID
AND L.GENERATE_USING_FORMULA_ID      = F.PRICE_FORMULA_ID
AND TO_NUMBER(A.product_attr_value)  = I.inventory_item_id
AND TO_NUMBER(A.LIST_LINE_ID)        = L.LIST_LINE_ID
AND I.organization_id                =
  (SELECT qp_util.get_item_validation_org FROM DUAL);

Wednesday, September 7, 2016

Import Data from csv to DB Table through ADF page

Import From Excel Into an ADF Table


ADF 11g provides an "export to Excel" feature that lets you take data from an ADF table into an Excel sheet. But what about the other way around? Sometimes you need to work with Excel sheets or comma-separated values and load the content of such a file into a database table. You could use the external table feature of the Oracle Database for that, or use any tool to load the CSV contents into a database table. The use case I have is that the data in the CSV needs to be validated and changed before it is committed to the database. In this post I show you how to load the CSV contents into an ADF table component so you can edit the data before committing.
Setting up the application
This post is based on a simple table and on a simple csv file. The Staff table definition is here:
CREATE TABLE ALS_STAFF
(
  ID NUMBER(3,0),
  FIRST_NAME VARCHAR2(20 BYTE),
  LAST_NAME VARCHAR2(30 BYTE),
  EMAIL_ADDRESS VARCHAR2(100 BYTE),
  CONSTRAINT ALS_STF_PK PRIMARY KEY (ID) ENABLE
);
And the csv file
ID,FIRST_NAME,LAST_NAME,EMAIL
100,Peter,Jones,testmail@mycompany.org
101,John,Janssen,a-testmail@mycompany.org
102,Thomas,Higgins,b-testmail@mycompany.org
.............
The first step in the process is the creation of Business Components for the ALS_STAFF table.
Next I created a task flow, a page fragment, and a page in which I implement the csvLoader functionality.
Then I drop the AlsStaff collection from the data control on the page as an ADF table component. This table is the one that I want to load with the data from a CSV file. To refresh the table after the file is loaded, I bind the table component so I can add it as a partialTarget. To upload the file I use an ADF inputFile component, which opens a dialog for me to select the file. Finally I add a commit button in order to actually save the table contents to the database.
The ’empty’ page now looks like this:
Setting up CSV2ADF
After uploading the file, I need it to be processed immediately, so I set the autosubmit property of the inputFile component to true and I add a valueChangeListener.
public void fileUploaded(ValueChangeEvent valueChangeEvent) {
    UploadedFile file = (UploadedFile) valueChangeEvent.getNewValue();
    try {
        parseFile(file.getInputStream());
        // refresh the bound table so the new rows show up
        AdfFacesContext.getCurrentInstance().addPartialTarget(staffTable);
    } catch (IOException e) {
        // report the failure to the user instead of swallowing it
        FacesContext fctx = FacesContext.getCurrentInstance();
        fctx.addMessage(null, new FacesMessage(FacesMessage.SEVERITY_ERROR,
            "Could not read uploaded file", e.getMessage()));
    }
}
In this valueChangeListener I call a method (parseFile()) that takes care of the CSV-to-ADF transformation. Here is the step-by-step guide.
First I create a new BufferedReader that I use to read the file line by line.
Next I have to find the iterator associated with the component in which I want to load the contents of the file.
The table is bound, so I can find the iterator via (CollectionModel)staffTable.getValue() and (JUCtrlHierBinding)_tableModel.getWrappedData().
Now we really can get started processing the file line by line. I assume that the first line of the file contains header information, so I always skip that line. For every line in the file I create a new row in the table. Then I use a StringTokenizer that splits the line at every comma, so every line is processed token by token. Every attribute in the row can now be assigned the value of the corresponding token. Any exceptions are handled and communicated back to the user.
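The header-skip and token-by-token logic can be tried in isolation, outside ADF. Below is a minimal, self-contained sketch of the same idea; the CsvSketch class name and the in-memory StringReader input are illustrative only, standing in for the uploaded file's InputStream:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;
import java.util.StringTokenizer;

public class CsvSketch {

    // Parse CSV text: skip the header line, split each data line on ","
    // and collect the tokens of each line as one String[] row.
    public static List<String[]> parse(String csv) {
        List<String[]> rows = new ArrayList<>();
        try (BufferedReader reader = new BufferedReader(new StringReader(csv))) {
            String line;
            int lineNumber = 0;
            while ((line = reader.readLine()) != null) {
                lineNumber++;
                if (lineNumber == 1) {
                    continue; // the header has line number 1, always skipped
                }
                List<String> tokens = new ArrayList<>();
                StringTokenizer st = new StringTokenizer(line, ",");
                while (st.hasMoreTokens()) {
                    tokens.add(st.nextToken());
                }
                rows.add(tokens.toArray(new String[0]));
            }
        } catch (IOException e) {
            // cannot happen for an in-memory reader
            throw new RuntimeException(e);
        }
        return rows;
    }

    public static void main(String[] args) {
        String csv = "ID,FIRST_NAME,LAST_NAME,EMAIL\n"
                   + "100,Peter,Jones,testmail@mycompany.org";
        List<String[]> rows = CsvSketch.parse(csv);
        System.out.println(rows.size());    // prints 1 (the header is skipped)
        System.out.println(rows.get(0)[1]); // prints Peter
    }
}
```

In the real listener, each String[] row would become an ADF Row and each token a setAttribute call, as the full listing below shows.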
public void parseFile(java.io.InputStream file) {
    BufferedReader reader = new BufferedReader(new InputStreamReader(file));
    String strLine = "";
    StringTokenizer st = null;
    int lineNumber = 0, tokenNumber = 0;
    Row rw = null;

    CollectionModel _tableModel = (CollectionModel) staffTable.getValue();
    // the ADF object that implements the CollectionModel is JUCtrlHierBinding;
    // it is wrapped by the CollectionModel API
    JUCtrlHierBinding _adfTableBinding =
        (JUCtrlHierBinding) _tableModel.getWrappedData();
    // access the ADF iterator binding that is used with the ADF table binding
    DCIteratorBinding it = _adfTableBinding.getDCIteratorBinding();

    // read the comma-separated file line by line
    try {
        while ((strLine = reader.readLine()) != null) {
            lineNumber++;
            // create a new row; skip the header (the header has line number 1)
            if (lineNumber > 1) {
                rw = it.getNavigatableRowIterator().createRow();
                rw.setNewRowState(Row.STATUS_INITIALIZED);
                it.getNavigatableRowIterator().insertRow(rw);
            }

            // break the comma-separated line on ","
            st = new StringTokenizer(strLine, ",");
            while (st.hasMoreTokens()) {
                // display csv values
                tokenNumber++;
                String theToken = st.nextToken();
                System.out.println("Line # " + lineNumber + ", Token # " +
                                   tokenNumber + ", Token : " + theToken);
                if (lineNumber > 1) {
                    // set the attribute values; the break statements matter --
                    // without them every case falls through and each attribute
                    // ends up with the value of the same token
                    switch (tokenNumber) {
                        case 1: rw.setAttribute("Id", theToken); break;
                        case 2: rw.setAttribute("FirstName", theToken); break;
                        case 3: rw.setAttribute("LastName", theToken); break;
                        case 4: rw.setAttribute("EmailAddress", theToken); break;
                    }
                }
            }
            // reset the token number for the next line
            tokenNumber = 0;
        }
    } catch (IOException e) {
        FacesContext fctx = FacesContext.getCurrentInstance();
        fctx.addMessage(staffTable.getClientId(fctx),
            new FacesMessage(FacesMessage.SEVERITY_ERROR,
                "Content Error in Uploaded file", e.getMessage()));
    } catch (Exception e) {
        FacesContext fctx = FacesContext.getCurrentInstance();
        fctx.addMessage(null,
            new FacesMessage(FacesMessage.SEVERITY_ERROR,
                "Data Error in Uploaded file", e.getMessage()));
    }
}
When I run the application, I can select a file to upload; the content of the file is displayed immediately in the table. I can edit the data, and once I am ready I can commit it to the database by hitting the commit button.
This example just loads the content of a predefined CSV file into a predefined ADF table component in order to save the content to a database table. A possible next step could be to load any CSV file into an ADF table component that is created at runtime.
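One caveat before taking that next step: StringTokenizer treats consecutive delimiters as one, so an empty field in a line such as 100,,Jones,... is silently dropped and every following value shifts one column to the left. A small sketch of the difference, using String.split with a negative limit, which keeps empty and trailing fields:

```java
public class SplitVsTokenizer {
    public static void main(String[] args) {
        // a line with an empty FIRST_NAME field
        String line = "100,,Jones,testmail@mycompany.org";

        // StringTokenizer collapses the empty field: only 3 tokens come back,
        // so "Jones" would be assigned to the FirstName attribute
        java.util.StringTokenizer st = new java.util.StringTokenizer(line, ",");
        int count = 0;
        while (st.hasMoreTokens()) {
            st.nextToken();
            count++;
        }
        System.out.println(count); // prints 3

        // String.split(",", -1) preserves empty (and trailing) fields
        String[] fields = line.split(",", -1);
        System.out.println(fields.length);         // prints 4
        System.out.println("[" + fields[1] + "]"); // prints [] -- the empty field
    }
}
```

Swapping the tokenizer loop in parseFile() for a split-based loop keeps the column positions stable when fields are empty.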