
Wednesday, July 17, 2013

Error getting Cluster Config Set URLs for uri (JDeveloper 10.1.3)

The error "Error getting Cluster Config Set URLs for uri" appeared when I was creating an Application Server Connection from JDeveloper. The Application Server version is 10.x. Please note that this is a clustered environment.




The following workaround worked for me.

I used the IP address instead of the server name in the Host Name field. It's worth looking at the /etc/hosts entries for the IP address. I retested the connection and still got the error, which I ignored. As long as you can use this Application Server Connection to create an Integration Server connection, it's OK to ignore... worked for me  :-)
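For reference, an /etc/hosts entry simply maps an IP address to one or more host names. The entry below is a hypothetical example, just to show the format you would look for:

# /etc/hosts (hypothetical IP and host name)
172.16.0.15   oasclusternode01.example.com   oasclusternode01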


I found these links very helpful; they mention some other workarounds...

Thursday, July 4, 2013

Connection issue: No route to host

A nice step-by-step investigation and solution to this problem from OTN. Wanted to share... :)

------------------------
Hi all,

I have a problem with my BPEL process (vers. 10.1.3.4).
I am trying to call an external web service and I get the following error (I can access the web service just using its URL):

exception on JaxRpc invoke: HTTP transport error: javax.xml.soap.SOAPException: java.security.PrivilegedActionException: javax.xml.soap.SOAPException: Message send failed: No route to host

I have tried pinging the remote server with success, so I should be able to connect (no proxy issue or something like that).

I have tried looking in the BPEL logs found at:
<oracle home>/opmn/logs
<oracle home>/bpel/system/logs
<oracle home>/bpel/domains/default/logs
This did not give anything useful either.

I deploy using JDeveloper. I have checked the proxy settings here - and there are none set.

Does anybody have any ideas as to what can be wrong?
Any help or hints will be much appreciated.

Regards,
Aagaard

------------------------

Hi Aagaard,

Maybe some open doors to kick in, but you say that you're able to ping the host of the web service and can access it using its URL. Did you test it on your development machine or on the machine that is running the Application Server? Since these need not be the same (often they're not, of course), it might be that your server has other routes than your development machine.

If the remote web service is running on http://<webserver>:<port>/webservice, I would open a terminal session on the server that's running your SOA Suite and not only do a ping but also a:
prompt> telnet <webserver> <port>
Note there is a space instead of a colon between webserver and port.
This way you connect specifically to the web server on the given port. If there is no service listening on that port you will get a connection refused error.

Regards,
Martien

------------------------
Hi Martien,

Thank you for the swift reply.

Yes, I tried pinging from both development machine and from application server. Both successful.

Tried the telnet approach and interestingly enough that failed.

[oracle@prima logs]$ telnet csc2835139 18111
Trying 172.19.205.69...
telnet: connect to address 172.19.205.69: No route to host
telnet: Unable to connect to remote host: No route to host

So I guess that means that I can access the remote machine, but that no service is listening on the given port?
Even though I can access the web service via a web browser?

Regards,
Aagaard
------------------------
Correct Answer by MartienvandenAkker  on Oct 6, 2008 10:49 AM
 
Well, it could also mean that port 18111 is blocked from the server that is running your application server. So I would check with your network administrators whether this port is open from your application server. They should check the firewalls (if you have no control over that yourself).

Often the situation at a customer is that your development PC is on one LAN while your (development, test, acceptance, production) application server is in a data center on a different sub-LAN, with another firewall. From your development PC you need to access internet HTTP and maybe other protocols, but these are usually forbidden on the data center LAN. You'll have a separate route from your development LAN to the data center. Probably the server has two NICs: one for the internal network and one connected to the Demilitarized Zone. Within the DMZ, a proxy/reverse-proxy configuration probably routes the requests from outside to your application server and vice versa. So the proxy should also be checked.

This is normally the case at a customer site, even when it's a development environment, because you don't want intruders hacking into your systems, even on development. And although you might not care, the dev environment should reflect your prod environment, so it should also use a DMZ with a proxy/reverse-proxy setup. If it is a setup on your laptop or your home LAN, your setup might be simpler, but then again you might have to cope with different firewalls. If it is all on your laptop (using VMware, for example), I would try temporarily shutting down all firewalls.

You said that you did not have a proxy. Does that apply to both your dev PC and the application server?
------------------------
Hi Martien,

You were right. I did not have access to the port from the BPEL server.
Getting access solved my problem.

Thank you very much for the assist.

Cheers,
Aagaard
 
 

Tuesday, July 2, 2013

Back to Basics: Execute Immediate and SELECT statement

The EXECUTE IMMEDIATE statement prepares and executes a dynamic SQL statement. For single-row queries we can fetch the data with the INTO clause and store it in a variable. We can also pass bind variables with the USING clause; placeholders for bind variables are written as text preceded by a colon (:). In this blog post, we will show how to use EXECUTE IMMEDIATE to retrieve data. The following small PL/SQL stored procedure demonstrates its use. Connect to the database via SQL*Plus using the proper credentials and create the procedure.

CREATE OR REPLACE PROCEDURE test_proc(p_object_name IN VARCHAR2)
AS
  v_object_type user_objects.object_type%TYPE;
BEGIN
  BEGIN
    EXECUTE IMMEDIATE
         'SELECT object_type '
      || '  FROM user_objects '
      || ' WHERE object_name = :obj_name'
      INTO v_object_type
      USING p_object_name;
    dbms_output.put_line('Object Type is ' || v_object_type);
  EXCEPTION
    WHEN NO_DATA_FOUND THEN
      RAISE_APPLICATION_ERROR(-20950, 'Object does not exist: ' || p_object_name);
    WHEN OTHERS THEN
      RAISE_APPLICATION_ERROR(-20951, 'Error retrieving object type: ' || SQLERRM);
  END;
END test_proc;
/
Once the procedure is created successfully, let us execute it to see the results (with SERVEROUTPUT enabled in SQL*Plus so that the dbms_output line is displayed).

SQL> exec test_proc('TEST_PROC');
Object Type is PROCEDURE

PL/SQL procedure successfully completed.
Here we have used a bind variable, supplied through the USING clause, to improve performance. During execution, the :obj_name placeholder is replaced with the value of p_object_name.
A couple of things to remember when dealing with dynamic SQL:
• The semicolon is not placed at the end of the SELECT statement; it is placed at the end of the EXECUTE IMMEDIATE statement.
• The same goes for the INTO clause: INTO sits outside the quoted SELECT statement.
As demonstrated above, EXECUTE IMMEDIATE is the way to deal with dynamic SQL.

Source: http://decipherinfosys.wordpress.com/2008/09/11/back-to-basics-execute-immediate-and-select-statement/

Monday, May 6, 2013

Oracle SOA ESB: Quick start

The overall process and tasks required for designing the enterprise service bus are outlined in the following list. Links to step-by-step instructions are specified for each task:
  1. Start the ESB Server and open Oracle JDeveloper
    The key Oracle Enterprise Service Bus design components are the ESB Server, Oracle JDeveloper, and the database that serves as the ESB Metadata Server. For information about starting ESB components, see "Starting, Stopping, and Accessing Oracle Enterprise Service Bus Components".
  2. Create the necessary server and database connections.
    See "Creating and Testing Connections".
  3. Create the Oracle JDeveloper applications and projects
    See "Creating Applications and ESB Projects".
  4. Create ESB systems and, optionally, service groups.
    See "Creating ESB Systems and Service Groups"
  5. Add or import files into an ESB project.
    See "Adding Project Content" and "Importing Files into a Project".
  6. Create the inbound and outbound services, or invoke external services.
    See Chapter 4, "Creating Inbound and Outbound Services".
  7. Create routing services and configure the routing rules to:
    • Apply conditional expressions to the message instance payload
    • Transform the structure of the message payload
    • Specify where a response in a request/response scenario is to be sent
    • Specify where faulted messages are to be sent
    See Chapter 5, "Creating Routing Services and Routing Rules".
  8. Register services with the ESB repository.
    See "Registering ESB Projects and Services with the ESB Server".
  9. Sync services with the ESB Server.
    See "Syncing Services From ESB Server".
If you no longer need a service, you can remove it. See "Deleting ESB Projects".

The text above is an extract from
Oracle® Enterprise Service Bus Developer's Guide
10g (10.1.3.3.0)
Part Number E10295-01
http://docs.oracle.com/cd/E11036_01/doc.1013/e10295/esb_jdev.htm

I find it a pretty short but handy overview. Hope it helps.

Monday, February 4, 2013

SSIS Tutorials with Screenshots

This SSIS tutorial is very nice. Every section is presented with screenshots.

http://www.mssqltips.com/sqlservertutorial/200/sql-server-integration-services-ssis/

Import MDF file into SQL Server Management Studio

1. Open SQL Management Studio Express and log in to the server to which you want to attach the database.
2. In the 'Object Explorer' window, right-click on the 'Databases' folder and select 'Attach...' The 'Attach Databases' window will open.
3. Inside that window click 'Add...' and then navigate to your .MDF file and click 'OK'.
4. Click 'OK' once more to finish attaching the database and you are done.

The database should be available for use.

NOTE: Please take a backup of the *.mdf file before you import it. This is because, if you attach an MDF file from SQL Server 2000 to SQL Server 2005, for example, there is no way to reattach the file to the earlier version of SQL Server.
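If you prefer T-SQL to the GUI, the attach can also be scripted. The sketch below uses a hypothetical database name and file path; adjust both to your environment.

-- Hypothetical database name and .mdf path.
-- If the matching .ldf file is missing and the database was shut down cleanly,
-- FOR ATTACH_REBUILD_LOG can be used instead of FOR ATTACH.
CREATE DATABASE MyImportedDb
    ON (FILENAME = 'C:\Data\MyImportedDb.mdf')
    FOR ATTACH;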

Thursday, January 31, 2013

Concurrent Program: phase_code, status_code meanings

Very old but still useful stuff...

status_code and phase_code in FND_CONCURRENT_REQUESTS table

PHASE_CODE column
C - Completed
I - Inactive
P - Pending
R - Running
 
STATUS_CODE Column:

A - Waiting
B - Resuming
C - Normal
D - Cancelled
E - Error
F - Scheduled
G - Warning
H - On Hold
I - Normal
M - No Manager
Q - Standby
R - Normal
S - Suspended
T - Terminating
U - Disabled
W - Paused
X - Terminated
Z - Waiting
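
As a quick illustration, these codes can be decoded directly in SQL against FND_CONCURRENT_REQUESTS. This is only a sketch; :p_request_id is a hypothetical bind variable holding the concurrent request ID.

-- Decode phase_code and status_code for a single request
-- (:p_request_id is a placeholder for the request you are interested in).
SELECT request_id,
       DECODE(phase_code, 'C', 'Completed', 'I', 'Inactive',
                          'P', 'Pending',   'R', 'Running', phase_code) phase,
       DECODE(status_code, 'A', 'Waiting',    'B', 'Resuming',    'C', 'Normal',
                           'D', 'Cancelled',  'E', 'Error',       'F', 'Scheduled',
                           'G', 'Warning',    'H', 'On Hold',     'I', 'Normal',
                           'M', 'No Manager', 'Q', 'Standby',     'R', 'Normal',
                           'S', 'Suspended',  'T', 'Terminating', 'U', 'Disabled',
                           'W', 'Paused',     'X', 'Terminated',  'Z', 'Waiting',
                           status_code) status
  FROM fnd_concurrent_requests
 WHERE request_id = :p_request_id;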

Monday, January 28, 2013

OBIEE and ODI

OBIEE: Oracle Business Intelligence Suite Enterprise Edition Plus (EE Plus) is a comprehensive suite of enterprise BI products, delivering the full range of BI capabilities, including interactive dashboards, full ad hoc analysis, proactive intelligence and alerts, enterprise and financial reporting, real-time predictive intelligence, disconnected analytics, and more. Featuring a unified, highly scalable, modern architecture, Oracle BI EE Plus provides intelligence and analytics from data spanning enterprise sources and applications, empowering the largest communities with complete and relevant insight.

In addition to providing the full gamut of BI functionality, the Oracle Business Intelligence Suite EE Plus platform is based on a proven, modern Web Services-Oriented Architecture that delivers true next-generation BI capabilities.

ODI: Oracle Data Integrator Enterprise Edition delivers unique next-generation Extract, Load, and Transform (E-LT) technology that improves performance and reduces data integration costs, even across heterogeneous systems. Unlike conventional ETL tools, Oracle Data Integrator EE offers the productivity of a declarative design approach, as well as the benefits of an active integration platform for seamless batch and real-time integration. In addition, hot-pluggable Knowledge Modules provide modularity, flexibility, and extensibility.

 

Sunday, January 27, 2013

OLE DB - Microsoft API to access non-RDBMS sources

The title already summarizes what OLE DB is about. Let's have a look at what Wikipedia has written about it.

OLE DB (Object Linking and Embedding, Database, sometimes written as OLEDB or OLE-DB), an API designed by Microsoft, allows accessing data from a variety of sources in a uniform manner. The API provides a set of interfaces implemented using the Component Object Model (COM); it is otherwise unrelated to OLE. Microsoft originally intended OLE DB as a higher-level replacement for, and successor to, ODBC, extending its feature set to support a wider variety of non-relational databases, such as object databases and spreadsheets that do not necessarily implement SQL.

http://en.wikipedia.org/wiki/OLE_DB

Tuesday, January 22, 2013

Data Mart vs OLAP Cubes (Traditional DB vs Multidimensional DB)

Data Mart and Cubes - In Short:
A data mart is a collection of data for a specific business process; how the data is stored is irrelevant.

A cube stores data in a special, multidimensional way, unlike a table with rows and columns. A cube in an OLAP database is like a table in a traditional database.

A data mart can contain tables or cubes. Cubes make analysis faster because they pre-calculate aggregations ahead of time.


In Detail:
A cube is an OLAP artifact. OLAP cubes can use a different API than a standard relational database. Typically, OLAP servers maintain their own optimized data structures (known as MOLAP), although they can be implemented as a front end to a relational data source (known as ROLAP) or in various hybrid modes (known as HOLAP).

'Data Mart' is also a fairly loosely used term and can mean any user-facing data access medium for a data warehouse system. The definition MAY or MAY NOT include the reporting tools and metadata layers, reporting layer tables or other items such as Cubes or other analytic systems.

Traditionally, a data mart is considered the database from which the reporting is done, particularly if it is an easily definable subsystem of the overall data warehouse architecture. However, it is quite reasonable to think of it as the user-facing reporting layer, particularly if there are ad hoc reporting tools such as Business Objects or OLAP systems that allow end users to get at the data directly.

http://stackoverflow.com/questions/360900/datamart-vs-reporting-cube-what-are-the-differences

Monday, January 14, 2013

How to examine OPP logs

From the application:

- Login to System Administrator Responsibility
- Navigate to: Concurrent > Manager > Administer
- Select the Output Post Processor
- Click on the Processes button
- Select the process that was active when the request ran
- Click on the Manager Log button to open the OPP log file

From the file system:
SQL> SELECT fcpp.concurrent_request_id req_id, fcp.node_name, fcp.logfile_name
FROM fnd_conc_pp_actions fcpp, fnd_concurrent_processes fcp
WHERE fcpp.processor_id = fcp.concurrent_process_id
AND fcpp.action_type = 6
AND fcpp.concurrent_request_id = <request_id>;
 
 
Credit goes to
Hussein Sawwan. Copied from his OTN post
https://kr.forums.oracle.com/forums/thread.jspa?threadID=646563

Thursday, January 10, 2013

EBS CP Parameters - small but useful stuff :)

Let's say you have two parameters in an EBS concurrent program (CP):

1. p_ledger_name
2. p_ledger_id

You want to set the value of p_ledger_id based on the value chosen for p_ledger_name. You can do it this way.

STEP 1
=====
Set the p_ledger_name parameter's default type to SQL Statement.
The SQL statement can be the following, which ensures users get to choose only the ledgers they have access to:

select name
  from gl_ledgers
 where ledger_id = gl_access_set_security_pkg.get_default_ledger_id(:$PROFILES$.GL_ACCESS_SET_ID, 'R')

STEP 2
=====
The p_ledger_id parameter's default should also be set to SQL Statement type, and the SQL can be as follows:

select gl_ledger_utils_pkg.get_ledger_id_of_short_name(:$FLEX$.GL_SRS_LEDGER_WITH_SETS_AND_ALCS_STORE_SHORT_NAME)
  FROM DUAL

Here GL_SRS_LEDGER_WITH_SETS_AND_ALCS_STORE_SHORT_NAME is the value set name of the p_ledger_name parameter (my first parameter). Now let's have a look at this value set.

This value set is a character-type value set with Validation Type = Table. You can check the value set details by clicking the Edit Information button.

STEP 3
=====
Save your CP and test it. When you choose the ledger name in the first parameter, the ledger ID is populated in the second parameter automatically. Cool!!